00:00:00.001 Started by upstream project "autotest-per-patch" build number 126125
00:00:00.001 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.032 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.032 The recommended git tool is: git
00:00:00.033 using credential 00000000-0000-0000-0000-000000000002
00:00:00.036 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.057 Fetching changes from the remote Git repository
00:00:00.061 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.101 Using shallow fetch with depth 1
00:00:00.101 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.101 > git --version # timeout=10
00:00:00.163 > git --version # 'git version 2.39.2'
00:00:00.163 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.221 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.221 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.915 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.930 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.945 Checking out Revision 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d (FETCH_HEAD)
00:00:02.945 > git config core.sparsecheckout # timeout=10
00:00:02.958 > git read-tree -mu HEAD # timeout=10
00:00:02.976 > git checkout -f 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=5
00:00:02.995 Commit message: "inventory: add WCP3 to free inventory"
00:00:02.996 > git rev-list --no-walk 9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d # timeout=10
00:00:03.082 [Pipeline] Start of Pipeline
00:00:03.094 [Pipeline] library
00:00:03.096 Loading library shm_lib@master
00:00:03.096 Library shm_lib@master is cached. Copying from home.
00:00:03.110 [Pipeline] node
00:00:03.120 Running on CYP6 in /var/jenkins/workspace/crypto-phy-autotest
00:00:03.122 [Pipeline] {
00:00:03.129 [Pipeline] catchError
00:00:03.130 [Pipeline] {
00:00:03.139 [Pipeline] wrap
00:00:03.145 [Pipeline] {
00:00:03.150 [Pipeline] stage
00:00:03.152 [Pipeline] { (Prologue)
00:00:03.312 [Pipeline] sh
00:00:03.593 + logger -p user.info -t JENKINS-CI
00:00:03.610 [Pipeline] echo
00:00:03.611 Node: CYP6
00:00:03.618 [Pipeline] sh
00:00:03.912 [Pipeline] setCustomBuildProperty
00:00:03.922 [Pipeline] echo
00:00:03.923 Cleanup processes
00:00:03.926 [Pipeline] sh
00:00:04.204 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.204 2321137 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.216 [Pipeline] sh
00:00:04.499 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.499 ++ grep -v 'sudo pgrep'
00:00:04.499 ++ awk '{print $1}'
00:00:04.499 + sudo kill -9
00:00:04.499 + true
00:00:04.512 [Pipeline] cleanWs
00:00:04.520 [WS-CLEANUP] Deleting project workspace...
00:00:04.520 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.527 [WS-CLEANUP] done
00:00:04.531 [Pipeline] setCustomBuildProperty
00:00:04.545 [Pipeline] sh
00:00:04.826 + sudo git config --global --replace-all safe.directory '*'
00:00:04.892 [Pipeline] httpRequest
00:00:04.917 [Pipeline] echo
00:00:04.918 Sorcerer 10.211.164.101 is alive
00:00:04.924 [Pipeline] httpRequest
00:00:04.928 HttpMethod: GET
00:00:04.929 URL: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.930 Sending request to url: http://10.211.164.101/packages/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:04.941 Response Code: HTTP/1.1 200 OK
00:00:04.942 Success: Status code 200 is in the accepted range: 200,404
00:00:04.942 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:05.860 [Pipeline] sh
00:00:06.141 + tar --no-same-owner -xf jbp_9bf0dabeadcf84e29a3d5dbec2430e38aceadf8d.tar.gz
00:00:06.156 [Pipeline] httpRequest
00:00:06.172 [Pipeline] echo
00:00:06.173 Sorcerer 10.211.164.101 is alive
00:00:06.181 [Pipeline] httpRequest
00:00:06.186 HttpMethod: GET
00:00:06.186 URL: http://10.211.164.101/packages/spdk_be7837808965c985c491e37cab2b42d513617364.tar.gz
00:00:06.187 Sending request to url: http://10.211.164.101/packages/spdk_be7837808965c985c491e37cab2b42d513617364.tar.gz
00:00:06.206 Response Code: HTTP/1.1 200 OK
00:00:06.206 Success: Status code 200 is in the accepted range: 200,404
00:00:06.207 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_be7837808965c985c491e37cab2b42d513617364.tar.gz
00:02:32.484 [Pipeline] sh
00:02:32.768 + tar --no-same-owner -xf spdk_be7837808965c985c491e37cab2b42d513617364.tar.gz
00:02:36.150 [Pipeline] sh
00:02:36.438 + git -C spdk log --oneline -n5
00:02:36.438 be7837808 bdev/nvme: show `numa_socket_id` for bdev_nvme_get_controllers
00:02:36.438 cf710e481 nvme: populate socket_id for rdma controllers
00:02:36.438 f1ebf4106 nvme: populate socket_id for tcp controllers
00:02:36.438 41c6d27b6 nvme: populate socket_id for pcie controllers
00:02:36.438 a7de6acf1 nvme: add spdk_nvme_ctrlr_get_socket_id()
00:02:36.451 [Pipeline] }
00:02:36.470 [Pipeline] // stage
00:02:36.480 [Pipeline] stage
00:02:36.482 [Pipeline] { (Prepare)
00:02:36.502 [Pipeline] writeFile
00:02:36.520 [Pipeline] sh
00:02:36.807 + logger -p user.info -t JENKINS-CI
00:02:36.821 [Pipeline] sh
00:02:37.110 + logger -p user.info -t JENKINS-CI
00:02:37.125 [Pipeline] sh
00:02:37.413 + cat autorun-spdk.conf
00:02:37.413 SPDK_RUN_FUNCTIONAL_TEST=1
00:02:37.413 SPDK_TEST_BLOCKDEV=1
00:02:37.413 SPDK_TEST_ISAL=1
00:02:37.413 SPDK_TEST_CRYPTO=1
00:02:37.413 SPDK_TEST_REDUCE=1
00:02:37.413 SPDK_TEST_VBDEV_COMPRESS=1
00:02:37.414 SPDK_RUN_UBSAN=1
00:02:37.422 RUN_NIGHTLY=0
00:02:37.428 [Pipeline] readFile
00:02:37.459 [Pipeline] withEnv
00:02:37.461 [Pipeline] {
00:02:37.475 [Pipeline] sh
00:02:37.765 + set -ex
00:02:37.765 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:02:37.765 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:02:37.765 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:02:37.765 ++ SPDK_TEST_BLOCKDEV=1
00:02:37.765 ++ SPDK_TEST_ISAL=1
00:02:37.765 ++ SPDK_TEST_CRYPTO=1
00:02:37.765 ++ SPDK_TEST_REDUCE=1
00:02:37.765 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:02:37.765 ++ SPDK_RUN_UBSAN=1
00:02:37.765 ++ RUN_NIGHTLY=0
00:02:37.765 + case $SPDK_TEST_NVMF_NICS in
00:02:37.765 + DRIVERS=
00:02:37.765 + [[ -n '' ]]
00:02:37.765 + exit 0
00:02:37.777 [Pipeline] }
00:02:37.797 [Pipeline] // withEnv
00:02:37.803 [Pipeline] } 00:02:37.820 [Pipeline] // stage 00:02:37.829 [Pipeline] catchError 00:02:37.831 [Pipeline] { 00:02:37.847 [Pipeline] timeout 00:02:37.848 Timeout set to expire in 40 min 00:02:37.849 [Pipeline] { 00:02:37.862 [Pipeline] stage 00:02:37.864 [Pipeline] { (Tests) 00:02:37.877 [Pipeline] sh 00:02:38.167 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:02:38.168 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest 00:02:38.168 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:02:38.168 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:02:38.168 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:38.168 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:02:38.168 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:02:38.168 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:02:38.168 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:02:38.168 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:02:38.168 + [[ crypto-phy-autotest == pkgdep-* ]] 00:02:38.168 + cd /var/jenkins/workspace/crypto-phy-autotest 00:02:38.168 + source /etc/os-release 00:02:38.168 ++ NAME='Fedora Linux' 00:02:38.168 ++ VERSION='38 (Cloud Edition)' 00:02:38.168 ++ ID=fedora 00:02:38.168 ++ VERSION_ID=38 00:02:38.168 ++ VERSION_CODENAME= 00:02:38.168 ++ PLATFORM_ID=platform:f38 00:02:38.168 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:02:38.168 ++ ANSI_COLOR='0;38;2;60;110;180' 00:02:38.168 ++ LOGO=fedora-logo-icon 00:02:38.168 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:02:38.168 ++ HOME_URL=https://fedoraproject.org/ 00:02:38.168 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:02:38.168 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:02:38.168 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:02:38.168 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:02:38.168 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:02:38.168 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:02:38.168 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:02:38.168 ++ SUPPORT_END=2024-05-14 00:02:38.168 ++ VARIANT='Cloud Edition' 00:02:38.168 ++ VARIANT_ID=cloud 00:02:38.168 + uname -a 00:02:38.168 Linux spdk-CYP-06 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:02:38.168 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:02:41.468 Hugepages 00:02:41.468 node hugesize free / total 00:02:41.468 node0 1048576kB 0 / 0 00:02:41.468 node0 2048kB 0 / 0 00:02:41.468 node1 1048576kB 0 / 0 00:02:41.728 node1 2048kB 0 / 0 00:02:41.728 00:02:41.728 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:41.728 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - - 00:02:41.728 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - - 00:02:41.728 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - - 00:02:41.728 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - - 00:02:41.728 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - - 00:02:41.728 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - - 00:02:41.728 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - - 00:02:41.728 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - - 00:02:41.728 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:02:41.728 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - - 00:02:41.728 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - - 00:02:41.728 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - - 00:02:41.728 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - - 00:02:41.728 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - - 00:02:41.728 I/OAT 
0000:80:01.5 8086 0b00 1 ioatdma - - 00:02:41.728 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - - 00:02:41.728 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - - 00:02:41.728 + rm -f /tmp/spdk-ld-path 00:02:41.728 + source autorun-spdk.conf 00:02:41.728 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:41.728 ++ SPDK_TEST_BLOCKDEV=1 00:02:41.728 ++ SPDK_TEST_ISAL=1 00:02:41.728 ++ SPDK_TEST_CRYPTO=1 00:02:41.728 ++ SPDK_TEST_REDUCE=1 00:02:41.728 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:02:41.728 ++ SPDK_RUN_UBSAN=1 00:02:41.728 ++ RUN_NIGHTLY=0 00:02:41.728 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:02:41.728 + [[ -n '' ]] 00:02:41.728 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:41.728 + for M in /var/spdk/build-*-manifest.txt 00:02:41.728 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:02:41.728 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:02:41.728 + for M in /var/spdk/build-*-manifest.txt 00:02:41.728 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:02:41.728 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:02:41.728 ++ uname 00:02:41.728 + [[ Linux == \L\i\n\u\x ]] 00:02:41.728 + sudo dmesg -T 00:02:41.989 + sudo dmesg --clear 00:02:41.989 + dmesg_pid=2322349 00:02:41.989 + [[ Fedora Linux == FreeBSD ]] 00:02:41.989 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:41.989 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:41.989 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:02:41.989 + [[ -x /usr/src/fio-static/fio ]] 00:02:41.989 + export FIO_BIN=/usr/src/fio-static/fio 00:02:41.989 + FIO_BIN=/usr/src/fio-static/fio 00:02:41.989 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:02:41.989 + sudo dmesg -Tw 00:02:41.989 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:02:41.989 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:02:41.989 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:41.989 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:41.989 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:02:41.989 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:41.989 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:41.989 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:02:41.989 Test configuration: 00:02:41.989 SPDK_RUN_FUNCTIONAL_TEST=1 00:02:41.989 SPDK_TEST_BLOCKDEV=1 00:02:41.989 SPDK_TEST_ISAL=1 00:02:41.989 SPDK_TEST_CRYPTO=1 00:02:41.989 SPDK_TEST_REDUCE=1 00:02:41.989 SPDK_TEST_VBDEV_COMPRESS=1 00:02:41.989 SPDK_RUN_UBSAN=1 00:02:41.989 RUN_NIGHTLY=0 15:39:02 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:41.989 15:39:02 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:02:41.989 15:39:02 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:41.989 15:39:02 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:41.989 15:39:02 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:41.989 15:39:02 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:41.989 15:39:02 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:41.989 15:39:02 -- paths/export.sh@5 -- $ export PATH 00:02:41.989 15:39:02 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:41.989 15:39:02 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:41.990 15:39:02 -- common/autobuild_common.sh@444 -- $ date +%s 00:02:41.990 15:39:02 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720791542.XXXXXX 00:02:41.990 15:39:02 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720791542.6ohywy 00:02:41.990 15:39:02 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:02:41.990 15:39:02 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 
00:02:41.990 15:39:02 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:02:41.990 15:39:02 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:02:41.990 15:39:02 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:02:41.990 15:39:02 -- common/autobuild_common.sh@460 -- $ get_config_params 00:02:41.990 15:39:02 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:02:41.990 15:39:02 -- common/autotest_common.sh@10 -- $ set +x 00:02:41.990 15:39:02 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:02:41.990 15:39:02 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:02:41.990 15:39:02 -- pm/common@17 -- $ local monitor 00:02:41.990 15:39:02 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:41.990 15:39:02 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:41.990 15:39:02 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:41.990 15:39:02 -- pm/common@21 -- $ date +%s 00:02:41.990 15:39:02 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:41.990 15:39:02 -- pm/common@25 -- $ sleep 1 00:02:41.990 15:39:02 -- pm/common@21 -- $ date +%s 00:02:41.990 15:39:02 -- pm/common@21 -- $ date +%s 00:02:41.990 15:39:02 -- pm/common@21 -- $ date +%s 00:02:41.990 15:39:02 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720791542 00:02:41.990 15:39:02 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720791542 00:02:41.990 15:39:02 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720791542 00:02:41.990 15:39:02 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1720791542 00:02:42.250 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720791542_collect-vmstat.pm.log 00:02:42.250 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720791542_collect-cpu-load.pm.log 00:02:42.250 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720791542_collect-cpu-temp.pm.log 00:02:42.250 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1720791542_collect-bmc-pm.bmc.pm.log 00:02:43.190 15:39:03 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:02:43.190 15:39:03 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD= 00:02:43.190 15:39:03 -- spdk/autobuild.sh@12 -- $ umask 022 00:02:43.190 15:39:03 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:43.190 15:39:03 -- spdk/autobuild.sh@16 -- $ date -u 00:02:43.190 Fri Jul 12 01:39:03 PM UTC 2024 00:02:43.190 15:39:03 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:02:43.190 v24.09-pre-226-gbe7837808 00:02:43.190 15:39:03 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:02:43.190 15:39:03 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:02:43.190 15:39:03 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:02:43.190 15:39:03 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:02:43.190 15:39:03 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:02:43.190 15:39:03 -- common/autotest_common.sh@10 -- $ set +x 00:02:43.190 ************************************ 00:02:43.190 START TEST ubsan 00:02:43.190 ************************************ 00:02:43.190 15:39:03 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:02:43.190 using ubsan 00:02:43.190 00:02:43.190 real 0m0.001s 00:02:43.190 user 0m0.001s 00:02:43.190 sys 0m0.000s 00:02:43.190 15:39:03 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:02:43.190 15:39:03 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:02:43.190 ************************************ 00:02:43.190 END TEST ubsan 00:02:43.190 ************************************ 00:02:43.190 15:39:03 -- common/autotest_common.sh@1142 -- $ return 0 00:02:43.190 15:39:03 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:02:43.190 15:39:03 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:43.190 15:39:03 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:43.190 15:39:03 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:02:43.190 15:39:03 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:43.190 15:39:03 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:43.190 15:39:03 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:43.190 15:39:03 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:02:43.190 15:39:03 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:02:43.190 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:02:43.190 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:43.761 Using 'verbs' RDMA provider 00:02:59.620 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:11.904 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:11.904 Creating mk/config.mk...done. 00:03:11.904 Creating mk/cc.flags.mk...done. 00:03:11.904 Type 'make' to build. 
00:03:11.904 15:39:32 -- spdk/autobuild.sh@69 -- $ run_test make make -j128 00:03:11.904 15:39:32 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:03:11.904 15:39:32 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:03:11.904 15:39:32 -- common/autotest_common.sh@10 -- $ set +x 00:03:11.904 ************************************ 00:03:11.904 START TEST make 00:03:11.904 ************************************ 00:03:11.904 15:39:32 make -- common/autotest_common.sh@1123 -- $ make -j128 00:03:12.475 make[1]: Nothing to be done for 'all'. 00:03:44.613 The Meson build system 00:03:44.613 Version: 1.3.1 00:03:44.613 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:03:44.613 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:03:44.613 Build type: native build 00:03:44.613 Program cat found: YES (/usr/bin/cat) 00:03:44.613 Project name: DPDK 00:03:44.613 Project version: 24.03.0 00:03:44.613 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:03:44.613 C linker for the host machine: cc ld.bfd 2.39-16 00:03:44.613 Host machine cpu family: x86_64 00:03:44.613 Host machine cpu: x86_64 00:03:44.613 Message: ## Building in Developer Mode ## 00:03:44.613 Program pkg-config found: YES (/usr/bin/pkg-config) 00:03:44.613 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:03:44.613 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:03:44.613 Program python3 found: YES (/usr/bin/python3) 00:03:44.613 Program cat found: YES (/usr/bin/cat) 00:03:44.613 Compiler for C supports arguments -march=native: YES 00:03:44.613 Checking for size of "void *" : 8 00:03:44.613 Checking for size of "void *" : 8 (cached) 00:03:44.613 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:03:44.613 Library m found: YES 00:03:44.613 Library numa found: YES 00:03:44.613 Has header "numaif.h" : YES 00:03:44.613 Library fdt found: NO 00:03:44.613 Library execinfo found: NO 00:03:44.613 Has header "execinfo.h" : YES 00:03:44.613 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:03:44.613 Run-time dependency libarchive found: NO (tried pkgconfig) 00:03:44.613 Run-time dependency libbsd found: NO (tried pkgconfig) 00:03:44.613 Run-time dependency jansson found: NO (tried pkgconfig) 00:03:44.613 Run-time dependency openssl found: YES 3.0.9 00:03:44.613 Run-time dependency libpcap found: YES 1.10.4 00:03:44.613 Has header "pcap.h" with dependency libpcap: YES 00:03:44.613 Compiler for C supports arguments -Wcast-qual: YES 00:03:44.613 Compiler for C supports arguments -Wdeprecated: YES 00:03:44.613 Compiler for C supports arguments -Wformat: YES 00:03:44.613 Compiler for C supports arguments -Wformat-nonliteral: NO 00:03:44.613 Compiler for C supports arguments -Wformat-security: NO 00:03:44.613 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:44.613 Compiler for C supports arguments -Wmissing-prototypes: YES 00:03:44.613 Compiler for C supports arguments -Wnested-externs: YES 00:03:44.613 Compiler for C supports arguments -Wold-style-definition: YES 00:03:44.613 Compiler for C supports arguments -Wpointer-arith: YES 00:03:44.613 Compiler for C supports arguments -Wsign-compare: YES 00:03:44.613 Compiler for C supports arguments -Wstrict-prototypes: YES 00:03:44.613 Compiler for C supports arguments -Wundef: YES 00:03:44.613 Compiler for C 
supports arguments -Wwrite-strings: YES 00:03:44.613 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:03:44.613 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:03:44.613 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:44.613 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:03:44.613 Program objdump found: YES (/usr/bin/objdump) 00:03:44.613 Compiler for C supports arguments -mavx512f: YES 00:03:44.613 Checking if "AVX512 checking" compiles: YES 00:03:44.613 Fetching value of define "__SSE4_2__" : 1 00:03:44.613 Fetching value of define "__AES__" : 1 00:03:44.613 Fetching value of define "__AVX__" : 1 00:03:44.613 Fetching value of define "__AVX2__" : 1 00:03:44.613 Fetching value of define "__AVX512BW__" : 1 00:03:44.613 Fetching value of define "__AVX512CD__" : 1 00:03:44.613 Fetching value of define "__AVX512DQ__" : 1 00:03:44.613 Fetching value of define "__AVX512F__" : 1 00:03:44.613 Fetching value of define "__AVX512VL__" : 1 00:03:44.613 Fetching value of define "__PCLMUL__" : 1 00:03:44.613 Fetching value of define "__RDRND__" : 1 00:03:44.613 Fetching value of define "__RDSEED__" : 1 00:03:44.613 Fetching value of define "__VPCLMULQDQ__" : 1 00:03:44.613 Fetching value of define "__znver1__" : (undefined) 00:03:44.613 Fetching value of define "__znver2__" : (undefined) 00:03:44.613 Fetching value of define "__znver3__" : (undefined) 00:03:44.613 Fetching value of define "__znver4__" : (undefined) 00:03:44.613 Compiler for C supports arguments -Wno-format-truncation: YES 00:03:44.613 Message: lib/log: Defining dependency "log" 00:03:44.613 Message: lib/kvargs: Defining dependency "kvargs" 00:03:44.613 Message: lib/telemetry: Defining dependency "telemetry" 00:03:44.613 Checking for function "getentropy" : NO 00:03:44.613 Message: lib/eal: Defining dependency "eal" 00:03:44.613 Message: lib/ring: Defining dependency "ring" 00:03:44.613 Message: lib/rcu: Defining dependency "rcu" 00:03:44.613 Message: lib/mempool: Defining dependency "mempool" 00:03:44.613 Message: lib/mbuf: Defining dependency "mbuf" 00:03:44.613 Fetching value of define "__PCLMUL__" : 1 (cached) 00:03:44.613 Fetching value of define "__AVX512F__" : 1 (cached) 00:03:44.613 Fetching value of define "__AVX512BW__" : 1 (cached) 00:03:44.613 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:03:44.613 Fetching value of define "__AVX512VL__" : 1 (cached) 00:03:44.613 Fetching value of define "__VPCLMULQDQ__" : 1 (cached) 00:03:44.613 Compiler for C supports arguments -mpclmul: YES 00:03:44.613 Compiler for C supports arguments -maes: YES 00:03:44.613 Compiler for C supports arguments -mavx512f: YES (cached) 00:03:44.613 Compiler for C supports arguments -mavx512bw: YES 00:03:44.613 Compiler for C supports arguments -mavx512dq: YES 00:03:44.613 Compiler for C supports arguments -mavx512vl: YES 00:03:44.613 Compiler for C supports arguments -mvpclmulqdq: YES 00:03:44.613 Compiler for C supports arguments -mavx2: YES 00:03:44.613 Compiler for C supports arguments -mavx: YES 00:03:44.613 Message: lib/net: Defining dependency "net" 00:03:44.613 Message: lib/meter: Defining dependency "meter" 00:03:44.613 Message: lib/ethdev: Defining dependency "ethdev" 00:03:44.613 Message: lib/pci: Defining dependency "pci" 00:03:44.613 Message: lib/cmdline: Defining dependency "cmdline" 00:03:44.613 Message: lib/hash: Defining dependency "hash" 00:03:44.613 Message: lib/timer: Defining dependency "timer" 00:03:44.613 Message: lib/compressdev: 
Defining dependency "compressdev" 00:03:44.613 Message: lib/cryptodev: Defining dependency "cryptodev" 00:03:44.613 Message: lib/dmadev: Defining dependency "dmadev" 00:03:44.613 Compiler for C supports arguments -Wno-cast-qual: YES 00:03:44.613 Message: lib/power: Defining dependency "power" 00:03:44.613 Message: lib/reorder: Defining dependency "reorder" 00:03:44.613 Message: lib/security: Defining dependency "security" 00:03:44.613 Has header "linux/userfaultfd.h" : YES 00:03:44.613 Has header "linux/vduse.h" : YES 00:03:44.613 Message: lib/vhost: Defining dependency "vhost" 00:03:44.614 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:03:44.614 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary" 00:03:44.614 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:03:44.614 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:03:44.614 Compiler for C supports arguments -std=c11: YES 00:03:44.614 Compiler for C supports arguments -Wno-strict-prototypes: YES 00:03:44.614 Compiler for C supports arguments -D_BSD_SOURCE: YES 00:03:44.614 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES 00:03:44.614 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES 00:03:44.614 Run-time dependency libmlx5 found: YES 1.24.44.0 00:03:44.614 Run-time dependency libibverbs found: YES 1.14.44.0 00:03:44.614 Library mtcr_ul found: NO 00:03:44.614 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES 00:03:44.614 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: 
YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO 00:03:45.557 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header 
"infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:03:45.557 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:03:45.557 Configuring mlx5_autoconf.h using configuration 00:03:45.557 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:03:45.557 Run-time dependency libcrypto found: YES 3.0.9 00:03:45.557 Library IPSec_MB found: YES 00:03:45.557 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:03:45.557 Message: drivers/common/qat: Defining dependency "common_qat" 00:03:45.557 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:03:45.557 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:03:45.557 Library IPSec_MB found: YES 00:03:45.557 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:03:45.557 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:03:45.557 Compiler for C supports arguments -std=c11: YES (cached) 00:03:45.557 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:03:45.557 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:03:45.557 
Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:03:45.557 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:03:45.557 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:03:45.557 Run-time dependency libisal found: NO (tried pkgconfig) 00:03:45.557 Library libisal found: NO 00:03:45.557 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:03:45.557 Compiler for C supports arguments -std=c11: YES (cached) 00:03:45.557 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:03:45.557 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:03:45.557 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:03:45.557 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:03:45.557 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:03:45.557 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:03:45.557 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:03:45.557 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:03:45.557 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:03:45.557 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:03:45.557 Program doxygen found: YES (/usr/bin/doxygen) 00:03:45.557 Configuring doxy-api-html.conf using configuration 00:03:45.557 Configuring doxy-api-man.conf using configuration 00:03:45.557 Program mandb found: YES (/usr/bin/mandb) 00:03:45.557 Program sphinx-build found: NO 00:03:45.557 Configuring rte_build_config.h using configuration 00:03:45.557 Message: 00:03:45.557 ================= 00:03:45.557 Applications Enabled 00:03:45.557 ================= 00:03:45.557 00:03:45.557 apps: 00:03:45.557 00:03:45.557 00:03:45.557 Message: 00:03:45.557 ================= 00:03:45.557 Libraries Enabled 00:03:45.557 ================= 00:03:45.557 00:03:45.557 libs: 00:03:45.557 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:03:45.557 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:03:45.557 cryptodev, dmadev, power, reorder, security, vhost, 00:03:45.557 00:03:45.557 Message: 00:03:45.557 =============== 00:03:45.557 Drivers Enabled 00:03:45.557 =============== 00:03:45.557 00:03:45.557 common: 00:03:45.557 mlx5, qat, 00:03:45.557 bus: 00:03:45.557 auxiliary, pci, vdev, 00:03:45.557 mempool: 00:03:45.557 ring, 00:03:45.557 dma: 00:03:45.557 00:03:45.557 net: 00:03:45.557 00:03:45.557 crypto: 00:03:45.557 ipsec_mb, mlx5, 00:03:45.557 compress: 00:03:45.557 isal, mlx5, 00:03:45.557 vdpa: 00:03:45.557 00:03:45.557 00:03:45.557 Message: 00:03:45.557 ================= 00:03:45.557 Content Skipped 00:03:45.557 ================= 00:03:45.557 00:03:45.557 apps: 00:03:45.557 dumpcap: explicitly disabled via build config 00:03:45.557 graph: explicitly disabled via build config 00:03:45.557 pdump: explicitly disabled via build config 00:03:45.557 proc-info: explicitly disabled via build config 00:03:45.557 test-acl: explicitly disabled via build config 00:03:45.557 test-bbdev: explicitly disabled via build config 00:03:45.557 test-cmdline: explicitly disabled via build config 00:03:45.558 test-compress-perf: explicitly disabled via build config 00:03:45.558 test-crypto-perf: explicitly disabled via build config 00:03:45.558 test-dma-perf: explicitly disabled via build config 00:03:45.558 test-eventdev: explicitly disabled via build config 00:03:45.558 test-fib: explicitly disabled via 
build config 00:03:45.558 test-flow-perf: explicitly disabled via build config 00:03:45.558 test-gpudev: explicitly disabled via build config 00:03:45.558 test-mldev: explicitly disabled via build config 00:03:45.558 test-pipeline: explicitly disabled via build config 00:03:45.558 test-pmd: explicitly disabled via build config 00:03:45.558 test-regex: explicitly disabled via build config 00:03:45.558 test-sad: explicitly disabled via build config 00:03:45.558 test-security-perf: explicitly disabled via build config 00:03:45.558 00:03:45.558 libs: 00:03:45.558 argparse: explicitly disabled via build config 00:03:45.558 metrics: explicitly disabled via build config 00:03:45.558 acl: explicitly disabled via build config 00:03:45.558 bbdev: explicitly disabled via build config 00:03:45.558 bitratestats: explicitly disabled via build config 00:03:45.558 bpf: explicitly disabled via build config 00:03:45.558 cfgfile: explicitly disabled via build config 00:03:45.558 distributor: explicitly disabled via build config 00:03:45.558 efd: explicitly disabled via build config 00:03:45.558 eventdev: explicitly disabled via build config 00:03:45.558 dispatcher: explicitly disabled via build config 00:03:45.558 gpudev: explicitly disabled via build config 00:03:45.558 gro: explicitly disabled via build config 00:03:45.558 gso: explicitly disabled via build config 00:03:45.558 ip_frag: explicitly disabled via build config 00:03:45.558 jobstats: explicitly disabled via build config 00:03:45.558 latencystats: explicitly disabled via build config 00:03:45.558 lpm: explicitly disabled via build config 00:03:45.558 member: explicitly disabled via build config 00:03:45.558 pcapng: explicitly disabled via build config 00:03:45.558 rawdev: explicitly disabled via build config 00:03:45.558 regexdev: explicitly disabled via build config 00:03:45.558 mldev: explicitly disabled via build config 00:03:45.558 rib: explicitly disabled via build config 00:03:45.558 sched: explicitly disabled via build config 00:03:45.558 stack: explicitly disabled via build config 00:03:45.558 ipsec: explicitly disabled via build config 00:03:45.558 pdcp: explicitly disabled via build config 00:03:45.558 fib: explicitly disabled via build config 00:03:45.558 port: explicitly disabled via build config 00:03:45.558 pdump: explicitly disabled via build config 00:03:45.558 table: explicitly disabled via build config 00:03:45.558 pipeline: explicitly disabled via build config 00:03:45.558 graph: explicitly disabled via build config 00:03:45.558 node: explicitly disabled via build config 00:03:45.558 00:03:45.558 drivers: 00:03:45.558 common/cpt: not in enabled drivers build config 00:03:45.558 common/dpaax: not in enabled drivers build config 00:03:45.558 common/iavf: not in enabled drivers build config 00:03:45.558 common/idpf: not in enabled drivers build config 00:03:45.558 common/ionic: not in enabled drivers build config 00:03:45.558 common/mvep: not in enabled drivers build config 00:03:45.558 common/octeontx: not in enabled drivers build config 00:03:45.558 bus/cdx: not in enabled drivers build config 00:03:45.558 bus/dpaa: not in enabled drivers build config 00:03:45.558 bus/fslmc: not in enabled drivers build config 00:03:45.558 bus/ifpga: not in enabled drivers build config 00:03:45.558 bus/platform: not in enabled drivers build config 00:03:45.558 bus/uacce: not in enabled drivers build config 00:03:45.558 bus/vmbus: not in enabled drivers build config 00:03:45.558 common/cnxk: not in enabled drivers build config 00:03:45.558 
common/nfp: not in enabled drivers build config 00:03:45.558 common/nitrox: not in enabled drivers build config 00:03:45.558 common/sfc_efx: not in enabled drivers build config 00:03:45.558 mempool/bucket: not in enabled drivers build config 00:03:45.558 mempool/cnxk: not in enabled drivers build config 00:03:45.558 mempool/dpaa: not in enabled drivers build config 00:03:45.558 mempool/dpaa2: not in enabled drivers build config 00:03:45.558 mempool/octeontx: not in enabled drivers build config 00:03:45.558 mempool/stack: not in enabled drivers build config 00:03:45.558 dma/cnxk: not in enabled drivers build config 00:03:45.558 dma/dpaa: not in enabled drivers build config 00:03:45.558 dma/dpaa2: not in enabled drivers build config 00:03:45.558 dma/hisilicon: not in enabled drivers build config 00:03:45.558 dma/idxd: not in enabled drivers build config 00:03:45.558 dma/ioat: not in enabled drivers build config 00:03:45.558 dma/skeleton: not in enabled drivers build config 00:03:45.558 net/af_packet: not in enabled drivers build config 00:03:45.558 net/af_xdp: not in enabled drivers build config 00:03:45.558 net/ark: not in enabled drivers build config 00:03:45.558 net/atlantic: not in enabled drivers build config 00:03:45.558 net/avp: not in enabled drivers build config 00:03:45.558 net/axgbe: not in enabled drivers build config 00:03:45.558 net/bnx2x: not in enabled drivers build config 00:03:45.558 net/bnxt: not in enabled drivers build config 00:03:45.558 net/bonding: not in enabled drivers build config 00:03:45.558 net/cnxk: not in enabled drivers build config 00:03:45.558 net/cpfl: not in enabled drivers build config 00:03:45.558 net/cxgbe: not in enabled drivers build config 00:03:45.558 net/dpaa: not in enabled drivers build config 00:03:45.558 net/dpaa2: not in enabled drivers build config 00:03:45.558 net/e1000: not in enabled drivers build config 00:03:45.558 net/ena: not in enabled drivers build config 00:03:45.558 net/enetc: not in enabled drivers build config 00:03:45.558 net/enetfec: not in enabled drivers build config 00:03:45.558 net/enic: not in enabled drivers build config 00:03:45.558 net/failsafe: not in enabled drivers build config 00:03:45.558 net/fm10k: not in enabled drivers build config 00:03:45.558 net/gve: not in enabled drivers build config 00:03:45.558 net/hinic: not in enabled drivers build config 00:03:45.558 net/hns3: not in enabled drivers build config 00:03:45.558 net/i40e: not in enabled drivers build config 00:03:45.558 net/iavf: not in enabled drivers build config 00:03:45.558 net/ice: not in enabled drivers build config 00:03:45.558 net/idpf: not in enabled drivers build config 00:03:45.558 net/igc: not in enabled drivers build config 00:03:45.558 net/ionic: not in enabled drivers build config 00:03:45.558 net/ipn3ke: not in enabled drivers build config 00:03:45.558 net/ixgbe: not in enabled drivers build config 00:03:45.558 net/mana: not in enabled drivers build config 00:03:45.558 net/memif: not in enabled drivers build config 00:03:45.558 net/mlx4: not in enabled drivers build config 00:03:45.558 net/mlx5: not in enabled drivers build config 00:03:45.558 net/mvneta: not in enabled drivers build config 00:03:45.558 net/mvpp2: not in enabled drivers build config 00:03:45.558 net/netvsc: not in enabled drivers build config 00:03:45.558 net/nfb: not in enabled drivers build config 00:03:45.558 net/nfp: not in enabled drivers build config 00:03:45.558 net/ngbe: not in enabled drivers build config 00:03:45.558 net/null: not in enabled drivers build config 
00:03:45.558 net/octeontx: not in enabled drivers build config 00:03:45.558 net/octeon_ep: not in enabled drivers build config 00:03:45.558 net/pcap: not in enabled drivers build config 00:03:45.558 net/pfe: not in enabled drivers build config 00:03:45.558 net/qede: not in enabled drivers build config 00:03:45.558 net/ring: not in enabled drivers build config 00:03:45.558 net/sfc: not in enabled drivers build config 00:03:45.558 net/softnic: not in enabled drivers build config 00:03:45.558 net/tap: not in enabled drivers build config 00:03:45.558 net/thunderx: not in enabled drivers build config 00:03:45.558 net/txgbe: not in enabled drivers build config 00:03:45.558 net/vdev_netvsc: not in enabled drivers build config 00:03:45.558 net/vhost: not in enabled drivers build config 00:03:45.558 net/virtio: not in enabled drivers build config 00:03:45.558 net/vmxnet3: not in enabled drivers build config 00:03:45.558 raw/*: missing internal dependency, "rawdev" 00:03:45.558 crypto/armv8: not in enabled drivers build config 00:03:45.558 crypto/bcmfs: not in enabled drivers build config 00:03:45.558 crypto/caam_jr: not in enabled drivers build config 00:03:45.558 crypto/ccp: not in enabled drivers build config 00:03:45.558 crypto/cnxk: not in enabled drivers build config 00:03:45.558 crypto/dpaa_sec: not in enabled drivers build config 00:03:45.558 crypto/dpaa2_sec: not in enabled drivers build config 00:03:45.558 crypto/mvsam: not in enabled drivers build config 00:03:45.558 crypto/nitrox: not in enabled drivers build config 00:03:45.558 crypto/null: not in enabled drivers build config 00:03:45.558 crypto/octeontx: not in enabled drivers build config 00:03:45.558 crypto/openssl: not in enabled drivers build config 00:03:45.558 crypto/scheduler: not in enabled drivers build config 00:03:45.558 crypto/uadk: not in enabled drivers build config 00:03:45.559 crypto/virtio: not in enabled drivers build config 00:03:45.559 compress/nitrox: not in enabled drivers build config 00:03:45.559 compress/octeontx: not in enabled drivers build config 00:03:45.559 compress/zlib: not in enabled drivers build config 00:03:45.559 regex/*: missing internal dependency, "regexdev" 00:03:45.559 ml/*: missing internal dependency, "mldev" 00:03:45.559 vdpa/ifc: not in enabled drivers build config 00:03:45.559 vdpa/mlx5: not in enabled drivers build config 00:03:45.559 vdpa/nfp: not in enabled drivers build config 00:03:45.559 vdpa/sfc: not in enabled drivers build config 00:03:45.559 event/*: missing internal dependency, "eventdev" 00:03:45.559 baseband/*: missing internal dependency, "bbdev" 00:03:45.559 gpu/*: missing internal dependency, "gpudev" 00:03:45.559 00:03:45.559 00:03:46.129 Build targets in project: 114 00:03:46.129 00:03:46.129 DPDK 24.03.0 00:03:46.129 00:03:46.130 User defined options 00:03:46.130 buildtype : debug 00:03:46.130 default_library : shared 00:03:46.130 libdir : lib 00:03:46.130 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:03:46.130 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:03:46.130 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:03:46.130 cpu_instruction_set: native 00:03:46.130 disable_apps : 
test-fib,test-sad,test,test-regex,test-security-perf,test-bbdev,dumpcap,test-crypto-perf,test-flow-perf,test-gpudev,test-cmdline,test-dma-perf,test-eventdev,test-pipeline,test-acl,proc-info,test-compress-perf,graph,test-pmd,test-mldev,pdump 00:03:46.130 disable_libs : bbdev,argparse,latencystats,member,gpudev,mldev,pipeline,lpm,efd,regexdev,sched,node,dispatcher,table,bpf,port,gro,fib,cfgfile,ip_frag,gso,rawdev,ipsec,pdcp,rib,acl,metrics,graph,pcapng,jobstats,eventdev,stack,bitratestats,distributor,pdump 00:03:46.130 enable_docs : false 00:03:46.130 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:03:46.130 enable_kmods : false 00:03:46.130 max_lcores : 128 00:03:46.130 tests : false 00:03:46.130 00:03:46.130 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:46.391 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:03:46.658 [1/377] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:03:46.658 [2/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:03:46.658 [3/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:03:46.658 [4/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:03:46.658 [5/377] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:03:46.658 [6/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:03:46.658 [7/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:03:46.658 [8/377] Linking static target lib/librte_kvargs.a 00:03:46.658 [9/377] Compiling C object lib/librte_log.a.p/log_log.c.o 00:03:46.658 [10/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:03:46.948 [11/377] Linking static target lib/librte_log.a 00:03:46.948 [12/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:03:46.948 [13/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:03:46.948 [14/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:03:46.948 [15/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:03:46.948 [16/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:03:46.948 [17/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:03:46.948 [18/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:03:46.948 [19/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:03:46.948 [20/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:03:46.948 [21/377] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:03:46.948 [22/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:03:46.948 [23/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:03:46.948 [24/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:03:46.948 [25/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:03:46.948 [26/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:03:47.207 [27/377] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:03:47.207 [28/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:03:47.207 [29/377] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:03:47.207 [30/377] Linking static target lib/librte_pci.a 00:03:47.207 [31/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:03:47.207 [32/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:03:47.207 [33/377] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:03:47.207 [34/377] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:03:47.207 [35/377] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:03:47.474 [36/377] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:03:47.474 [37/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:03:47.474 [38/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:03:47.474 [39/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:03:47.474 [40/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:03:47.474 [41/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:03:47.474 [42/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:03:47.474 [43/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:03:47.474 [44/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:03:47.474 [45/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:03:47.474 [46/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:03:47.474 [47/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:03:47.734 [48/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:03:47.734 [49/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:03:47.734 [50/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:03:47.734 [51/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:03:47.734 [52/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:03:47.735 [53/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:03:47.735 [54/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:03:47.735 [55/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:03:47.735 [56/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:03:47.735 [57/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:03:47.735 [58/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:03:47.735 [59/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:03:47.735 [60/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:03:47.735 [61/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:03:47.735 [62/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:03:47.735 [63/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:03:47.735 [64/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:03:47.735 [65/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:03:47.735 [66/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:03:47.735 [67/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:03:47.735 [68/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 
00:03:47.735 [69/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:03:47.735 [70/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:03:47.735 [71/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:03:47.735 [72/377] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:03:47.735 [73/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:03:47.735 [74/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:03:47.735 [75/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:03:47.735 [76/377] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:03:47.735 [77/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:03:47.735 [78/377] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:03:47.735 [79/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:03:47.735 [80/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:03:47.735 [81/377] Linking static target lib/librte_telemetry.a 00:03:47.735 [82/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:03:47.735 [83/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:03:47.735 [84/377] Compiling C object lib/librte_net.a.p/net_net_crc_avx512.c.o 00:03:47.735 [85/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:03:47.735 [86/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:03:47.735 [87/377] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:03:47.735 [88/377] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:03:47.735 [89/377] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:03:47.735 [90/377] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:03:47.735 [91/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:03:47.735 [92/377] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:03:47.735 [93/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:03:47.735 [94/377] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:03:47.735 [95/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:03:47.735 [96/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:03:47.735 [97/377] Linking static target lib/librte_timer.a 00:03:47.735 [98/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:03:47.735 [99/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:03:47.735 [100/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:03:48.003 [101/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:03:48.003 [102/377] Linking static target lib/librte_ring.a 00:03:48.003 [103/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:03:48.003 [104/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:03:48.003 [105/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:03:48.003 [106/377] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:03:48.003 [107/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:03:48.003 [108/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:03:48.003 [109/377] Compiling C 
object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:03:48.003 [110/377] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:03:48.003 [111/377] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.003 [112/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:03:48.003 [113/377] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:03:48.003 [114/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:03:48.003 [115/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:03:48.003 [116/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:03:48.003 [117/377] Linking static target lib/librte_cmdline.a 00:03:48.003 [118/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:03:48.003 [119/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:03:48.003 [120/377] Linking static target lib/librte_dmadev.a 00:03:48.003 [121/377] Linking static target lib/librte_net.a 00:03:48.003 [122/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:03:48.003 [123/377] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:03:48.003 [124/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:03:48.003 [125/377] Linking static target lib/librte_compressdev.a 00:03:48.003 [126/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:03:48.003 [127/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:03:48.003 [128/377] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:03:48.003 [129/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:03:48.003 [130/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:03:48.003 [131/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:03:48.003 [132/377] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:03:48.003 [133/377] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:03:48.003 [134/377] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:03:48.003 [135/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:03:48.003 [136/377] Linking static target lib/librte_rcu.a 00:03:48.262 [137/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:03:48.262 [138/377] Linking static target lib/librte_mempool.a 00:03:48.262 [139/377] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:03:48.262 [140/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:03:48.262 [141/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:03:48.262 [142/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:03:48.262 [143/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:03:48.262 [144/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:03:48.262 [145/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:03:48.262 [146/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:03:48.262 [147/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:03:48.262 [148/377] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:03:48.262 [149/377] Linking static target lib/librte_reorder.a 00:03:48.262 [150/377] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:03:48.262 [151/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:03:48.262 [152/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:03:48.262 [153/377] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:03:48.262 [154/377] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:03:48.262 [155/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:03:48.262 [156/377] Linking static target lib/librte_hash.a 00:03:48.262 [157/377] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:03:48.262 [158/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:03:48.262 [159/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:03:48.262 [160/377] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:03:48.262 [161/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:03:48.262 [162/377] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.521 [163/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:03:48.521 [164/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:03:48.521 [165/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:03:48.521 [166/377] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:03:48.521 [167/377] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.521 [168/377] Linking static target drivers/libtmp_rte_bus_vdev.a 00:03:48.521 [169/377] Linking target lib/librte_log.so.24.1 00:03:48.521 [170/377] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:03:48.521 [171/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:03:48.521 [172/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:03:48.521 [173/377] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.521 [174/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:03:48.521 [175/377] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:03:48.521 [176/377] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:03:48.521 [177/377] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.521 [178/377] Linking static target lib/librte_meter.a 00:03:48.521 [179/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:03:48.521 [180/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:03:48.521 [181/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:03:48.521 [182/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:03:48.521 [183/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:03:48.521 [184/377] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:03:48.521 [185/377] Linking static target lib/librte_security.a 00:03:48.521 [186/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:03:48.521 [187/377] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.521 [188/377] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:03:48.521 [189/377] Linking static target lib/librte_mbuf.a 00:03:48.521 [190/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:03:48.521 [191/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:03:48.521 [192/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:03:48.521 [193/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:03:48.521 [194/377] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.521 [195/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:03:48.521 [196/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:03:48.521 [197/377] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:03:48.521 [198/377] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:03:48.521 [199/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:03:48.521 [200/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:03:48.521 [201/377] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:03:48.521 [202/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:03:48.521 [203/377] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:03:48.521 [204/377] Linking static target drivers/libtmp_rte_mempool_ring.a 00:03:48.521 [205/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:03:48.521 [206/377] Linking static target lib/librte_eal.a 00:03:48.521 [207/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:03:48.521 [208/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:03:48.521 [209/377] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:03:48.521 [210/377] Linking target lib/librte_kvargs.so.24.1 00:03:48.521 [211/377] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:03:48.521 [212/377] Linking target lib/librte_telemetry.so.24.1 00:03:48.521 [213/377] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:03:48.521 [214/377] Linking static target drivers/librte_bus_auxiliary.a 00:03:48.521 [215/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:03:48.521 [216/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:03:48.522 [217/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:03:48.522 [218/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:03:48.522 [219/377] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:03:48.522 [220/377] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:03:48.522 [221/377] Linking static target drivers/librte_bus_vdev.a 00:03:48.522 [222/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:03:48.522 [223/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 
00:03:48.522 [224/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:03:48.522 [225/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:03:48.522 [226/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:03:48.781 [227/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:03:48.781 [228/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:03:48.781 [229/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:03:48.781 [230/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:03:48.781 [231/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:03:48.781 [232/377] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.781 [233/377] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.781 [234/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:03:48.781 [235/377] Linking static target drivers/libtmp_rte_bus_pci.a 00:03:48.781 [236/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:03:48.781 [237/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:03:48.781 [238/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:03:48.781 [239/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:03:48.781 [240/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:03:48.781 [241/377] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:03:48.781 [242/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:03:48.781 [243/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:03:48.781 [244/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:03:48.781 [245/377] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:03:48.781 [246/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:03:48.781 [247/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:03:48.781 [248/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:03:48.781 [249/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:03:48.781 [250/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:03:48.781 [251/377] Linking static target lib/librte_power.a 00:03:48.781 [252/377] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:03:48.781 [253/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:03:48.781 [254/377] Linking static target drivers/libtmp_rte_compress_isal.a 00:03:48.781 [255/377] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:03:48.781 [256/377] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.781 [257/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 
00:03:48.781 [258/377] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.781 [259/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:03:48.781 [260/377] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:03:48.781 [261/377] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:03:48.781 [262/377] Linking static target lib/librte_cryptodev.a 00:03:48.781 [263/377] Linking static target drivers/librte_mempool_ring.a 00:03:48.781 [264/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:03:48.781 [265/377] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:03:48.781 [266/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:03:48.781 [267/377] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:03:48.781 [268/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:03:48.781 [269/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:03:48.781 [270/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:03:48.781 [271/377] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:03:48.781 [272/377] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:03:48.781 [273/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:03:48.781 [274/377] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:03:48.781 [275/377] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:03:48.781 [276/377] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:03:49.040 [277/377] Linking static target drivers/librte_bus_pci.a 00:03:49.040 [278/377] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:03:49.040 [279/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:03:49.040 [280/377] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.040 [281/377] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:03:49.040 [282/377] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:03:49.040 [283/377] Linking static target drivers/librte_compress_isal.a 00:03:49.040 [284/377] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.040 [285/377] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:03:49.040 [286/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:03:49.040 [287/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:03:49.040 [288/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:03:49.040 [289/377] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:03:49.040 [290/377] Linking static target drivers/libtmp_rte_common_mlx5.a 00:03:49.040 [291/377] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:03:49.040 [292/377] Linking static target 
drivers/libtmp_rte_crypto_ipsec_mb.a 00:03:49.040 [293/377] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:03:49.040 [294/377] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.040 [295/377] Linking static target drivers/librte_compress_mlx5.a 00:03:49.040 [296/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:03:49.040 [297/377] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:03:49.040 [298/377] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:03:49.040 [299/377] Linking static target drivers/librte_crypto_mlx5.a 00:03:49.040 [300/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:03:49.298 [301/377] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.298 [302/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:03:49.298 [303/377] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.298 [304/377] Linking static target lib/librte_ethdev.a 00:03:49.298 [305/377] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:03:49.298 [306/377] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:03:49.298 [307/377] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.298 [308/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:03:49.298 [309/377] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:03:49.298 [310/377] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:03:49.298 [311/377] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:03:49.298 [312/377] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:03:49.298 [313/377] Linking static target drivers/librte_common_mlx5.a 00:03:49.298 [314/377] Linking static target drivers/librte_crypto_ipsec_mb.a 00:03:49.556 [315/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:03:49.556 [316/377] Linking static target drivers/libtmp_rte_common_qat.a 00:03:49.816 [317/377] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.816 [318/377] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:03:49.816 [319/377] Generating drivers/rte_common_qat.pmd.c with a custom command 00:03:49.816 [320/377] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:03:49.816 [321/377] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:03:49.816 [322/377] Linking static target drivers/librte_common_qat.a 00:03:50.076 [323/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:03:50.076 [324/377] Linking static target lib/librte_vhost.a 00:03:51.019 [325/377] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:03:52.404 [326/377] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:03:54.984 [327/377] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:03:59.189 
[328/377] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:04:00.575 [329/377] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:04:00.575 [330/377] Linking target lib/librte_eal.so.24.1 00:04:00.835 [331/377] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:04:00.835 [332/377] Linking target lib/librte_ring.so.24.1 00:04:00.835 [333/377] Linking target lib/librte_meter.so.24.1 00:04:00.835 [334/377] Linking target lib/librte_timer.so.24.1 00:04:00.835 [335/377] Linking target lib/librte_dmadev.so.24.1 00:04:00.835 [336/377] Linking target drivers/librte_bus_auxiliary.so.24.1 00:04:00.835 [337/377] Linking target drivers/librte_bus_vdev.so.24.1 00:04:00.835 [338/377] Linking target lib/librte_pci.so.24.1 00:04:00.835 [339/377] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:04:00.835 [340/377] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:04:00.835 [341/377] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:04:00.835 [342/377] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:04:00.835 [343/377] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:04:00.835 [344/377] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:04:01.096 [345/377] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:04:01.096 [346/377] Linking target lib/librte_rcu.so.24.1 00:04:01.096 [347/377] Linking target lib/librte_mempool.so.24.1 00:04:01.096 [348/377] Linking target drivers/librte_bus_pci.so.24.1 00:04:01.096 [349/377] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:04:01.096 [350/377] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:04:01.096 [351/377] Linking target drivers/librte_mempool_ring.so.24.1 00:04:01.096 [352/377] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:04:01.096 [353/377] Linking target lib/librte_mbuf.so.24.1 00:04:01.356 [354/377] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:04:01.356 [355/377] Linking target lib/librte_compressdev.so.24.1 00:04:01.356 [356/377] Linking target lib/librte_net.so.24.1 00:04:01.356 [357/377] Linking target lib/librte_reorder.so.24.1 00:04:01.356 [358/377] Linking target lib/librte_cryptodev.so.24.1 00:04:01.615 [359/377] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:04:01.615 [360/377] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:04:01.615 [361/377] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:04:01.615 [362/377] Linking target drivers/librte_compress_isal.so.24.1 00:04:01.615 [363/377] Linking target lib/librte_security.so.24.1 00:04:01.615 [364/377] Linking target lib/librte_cmdline.so.24.1 00:04:01.615 [365/377] Linking target lib/librte_ethdev.so.24.1 00:04:01.615 [366/377] Linking target lib/librte_hash.so.24.1 00:04:01.875 [367/377] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:04:01.875 [368/377] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:04:01.875 [369/377] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:04:01.875 
[370/377] Linking target lib/librte_power.so.24.1 00:04:01.875 [371/377] Linking target drivers/librte_common_mlx5.so.24.1 00:04:01.875 [372/377] Linking target lib/librte_vhost.so.24.1 00:04:02.136 [373/377] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:04:02.136 [374/377] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:04:02.136 [375/377] Linking target drivers/librte_common_qat.so.24.1 00:04:02.136 [376/377] Linking target drivers/librte_compress_mlx5.so.24.1 00:04:02.136 [377/377] Linking target drivers/librte_crypto_mlx5.so.24.1 00:04:02.136 INFO: autodetecting backend as ninja 00:04:02.136 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 128 00:04:03.520 CC lib/ut/ut.o 00:04:03.520 CC lib/log/log.o 00:04:03.520 CC lib/log/log_flags.o 00:04:03.520 CC lib/log/log_deprecated.o 00:04:03.520 CC lib/ut_mock/mock.o 00:04:03.520 LIB libspdk_ut_mock.a 00:04:03.520 LIB libspdk_ut.a 00:04:03.520 LIB libspdk_log.a 00:04:03.520 SO libspdk_ut_mock.so.6.0 00:04:03.520 SO libspdk_ut.so.2.0 00:04:03.520 SO libspdk_log.so.7.0 00:04:03.520 SYMLINK libspdk_ut_mock.so 00:04:03.520 SYMLINK libspdk_ut.so 00:04:03.780 SYMLINK libspdk_log.so 00:04:04.040 CC lib/util/base64.o 00:04:04.040 CC lib/util/bit_array.o 00:04:04.040 CC lib/util/cpuset.o 00:04:04.040 CC lib/util/crc16.o 00:04:04.040 CC lib/util/crc32.o 00:04:04.040 CC lib/util/crc32c.o 00:04:04.040 CC lib/util/crc32_ieee.o 00:04:04.040 CC lib/util/dif.o 00:04:04.040 CC lib/util/crc64.o 00:04:04.040 CC lib/util/file.o 00:04:04.040 CC lib/dma/dma.o 00:04:04.040 CC lib/util/fd.o 00:04:04.040 CC lib/util/fd_group.o 00:04:04.040 CC lib/util/iov.o 00:04:04.040 CC lib/util/hexlify.o 00:04:04.040 CXX lib/trace_parser/trace.o 00:04:04.040 CC lib/util/math.o 00:04:04.040 CC lib/util/net.o 00:04:04.040 CC lib/ioat/ioat.o 00:04:04.040 CC lib/util/pipe.o 00:04:04.040 CC lib/util/strerror_tls.o 00:04:04.040 CC lib/util/string.o 00:04:04.040 CC lib/util/uuid.o 00:04:04.040 CC lib/util/xor.o 00:04:04.040 CC lib/util/zipf.o 00:04:04.300 CC lib/vfio_user/host/vfio_user_pci.o 00:04:04.300 CC lib/vfio_user/host/vfio_user.o 00:04:04.300 LIB libspdk_dma.a 00:04:04.300 SO libspdk_dma.so.4.0 00:04:04.300 LIB libspdk_ioat.a 00:04:04.300 SO libspdk_ioat.so.7.0 00:04:04.300 SYMLINK libspdk_dma.so 00:04:04.560 SYMLINK libspdk_ioat.so 00:04:04.560 LIB libspdk_vfio_user.a 00:04:04.560 SO libspdk_vfio_user.so.5.0 00:04:04.560 LIB libspdk_util.a 00:04:04.560 SYMLINK libspdk_vfio_user.so 00:04:04.560 SO libspdk_util.so.9.1 00:04:04.822 SYMLINK libspdk_util.so 00:04:04.822 LIB libspdk_trace_parser.a 00:04:04.822 SO libspdk_trace_parser.so.5.0 00:04:05.082 SYMLINK libspdk_trace_parser.so 00:04:05.082 CC lib/json/json_parse.o 00:04:05.082 CC lib/json/json_util.o 00:04:05.082 CC lib/json/json_write.o 00:04:05.082 CC lib/vmd/vmd.o 00:04:05.082 CC lib/vmd/led.o 00:04:05.082 CC lib/conf/conf.o 00:04:05.082 CC lib/idxd/idxd.o 00:04:05.082 CC lib/idxd/idxd_user.o 00:04:05.082 CC lib/rdma_utils/rdma_utils.o 00:04:05.082 CC lib/idxd/idxd_kernel.o 00:04:05.082 CC lib/rdma_provider/common.o 00:04:05.082 CC lib/env_dpdk/env.o 00:04:05.082 CC lib/rdma_provider/rdma_provider_verbs.o 00:04:05.082 CC lib/env_dpdk/memory.o 00:04:05.082 CC lib/reduce/reduce.o 00:04:05.082 CC lib/env_dpdk/pci.o 00:04:05.082 CC lib/env_dpdk/init.o 00:04:05.082 CC lib/env_dpdk/threads.o 00:04:05.082 CC lib/env_dpdk/pci_ioat.o 00:04:05.082 CC lib/env_dpdk/pci_virtio.o 00:04:05.082 
CC lib/env_dpdk/pci_vmd.o 00:04:05.082 CC lib/env_dpdk/pci_idxd.o 00:04:05.082 CC lib/env_dpdk/pci_event.o 00:04:05.082 CC lib/env_dpdk/sigbus_handler.o 00:04:05.082 CC lib/env_dpdk/pci_dpdk.o 00:04:05.082 CC lib/env_dpdk/pci_dpdk_2207.o 00:04:05.082 CC lib/env_dpdk/pci_dpdk_2211.o 00:04:05.343 LIB libspdk_rdma_provider.a 00:04:05.343 LIB libspdk_conf.a 00:04:05.343 SO libspdk_rdma_provider.so.6.0 00:04:05.343 SO libspdk_conf.so.6.0 00:04:05.343 LIB libspdk_json.a 00:04:05.603 SYMLINK libspdk_rdma_provider.so 00:04:05.603 SYMLINK libspdk_conf.so 00:04:05.603 SO libspdk_json.so.6.0 00:04:05.603 SYMLINK libspdk_json.so 00:04:05.603 LIB libspdk_idxd.a 00:04:05.603 SO libspdk_idxd.so.12.0 00:04:05.603 LIB libspdk_reduce.a 00:04:05.603 LIB libspdk_vmd.a 00:04:05.864 LIB libspdk_rdma_utils.a 00:04:05.864 SO libspdk_reduce.so.6.0 00:04:05.864 SO libspdk_vmd.so.6.0 00:04:05.864 SYMLINK libspdk_idxd.so 00:04:05.864 SO libspdk_rdma_utils.so.1.0 00:04:05.864 SYMLINK libspdk_vmd.so 00:04:05.864 SYMLINK libspdk_reduce.so 00:04:05.864 SYMLINK libspdk_rdma_utils.so 00:04:05.864 CC lib/jsonrpc/jsonrpc_server.o 00:04:05.864 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:04:05.864 CC lib/jsonrpc/jsonrpc_client.o 00:04:05.864 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:04:06.125 LIB libspdk_jsonrpc.a 00:04:06.125 SO libspdk_jsonrpc.so.6.0 00:04:06.386 SYMLINK libspdk_jsonrpc.so 00:04:06.386 LIB libspdk_env_dpdk.a 00:04:06.386 SO libspdk_env_dpdk.so.15.0 00:04:06.646 SYMLINK libspdk_env_dpdk.so 00:04:06.646 CC lib/rpc/rpc.o 00:04:06.907 LIB libspdk_rpc.a 00:04:06.907 SO libspdk_rpc.so.6.0 00:04:06.907 SYMLINK libspdk_rpc.so 00:04:07.478 CC lib/keyring/keyring.o 00:04:07.478 CC lib/keyring/keyring_rpc.o 00:04:07.478 CC lib/notify/notify.o 00:04:07.478 CC lib/notify/notify_rpc.o 00:04:07.478 CC lib/trace/trace.o 00:04:07.478 CC lib/trace/trace_flags.o 00:04:07.478 CC lib/trace/trace_rpc.o 00:04:07.478 LIB libspdk_notify.a 00:04:07.478 SO libspdk_notify.so.6.0 00:04:07.478 LIB libspdk_keyring.a 00:04:07.478 LIB libspdk_trace.a 00:04:07.478 SO libspdk_keyring.so.1.0 00:04:07.739 SYMLINK libspdk_notify.so 00:04:07.739 SO libspdk_trace.so.10.0 00:04:07.739 SYMLINK libspdk_keyring.so 00:04:07.739 SYMLINK libspdk_trace.so 00:04:07.999 CC lib/thread/thread.o 00:04:07.999 CC lib/thread/iobuf.o 00:04:07.999 CC lib/sock/sock.o 00:04:07.999 CC lib/sock/sock_rpc.o 00:04:08.569 LIB libspdk_sock.a 00:04:08.569 SO libspdk_sock.so.10.0 00:04:08.569 SYMLINK libspdk_sock.so 00:04:08.829 CC lib/nvme/nvme_ctrlr_cmd.o 00:04:08.829 CC lib/nvme/nvme_ctrlr.o 00:04:08.829 CC lib/nvme/nvme_fabric.o 00:04:08.829 CC lib/nvme/nvme_ns_cmd.o 00:04:08.829 CC lib/nvme/nvme_ns.o 00:04:08.829 CC lib/nvme/nvme_pcie.o 00:04:08.829 CC lib/nvme/nvme_pcie_common.o 00:04:08.829 CC lib/nvme/nvme_qpair.o 00:04:08.829 CC lib/nvme/nvme.o 00:04:08.829 CC lib/nvme/nvme_quirks.o 00:04:08.829 CC lib/nvme/nvme_transport.o 00:04:08.829 CC lib/nvme/nvme_discovery.o 00:04:08.829 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:04:08.829 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:04:08.829 CC lib/nvme/nvme_tcp.o 00:04:08.829 CC lib/nvme/nvme_opal.o 00:04:08.829 CC lib/nvme/nvme_io_msg.o 00:04:08.829 CC lib/nvme/nvme_poll_group.o 00:04:08.829 CC lib/nvme/nvme_zns.o 00:04:08.829 CC lib/nvme/nvme_stubs.o 00:04:08.829 CC lib/nvme/nvme_auth.o 00:04:08.829 CC lib/nvme/nvme_cuse.o 00:04:08.829 CC lib/nvme/nvme_rdma.o 00:04:09.399 LIB libspdk_thread.a 00:04:09.399 SO libspdk_thread.so.10.1 00:04:09.399 SYMLINK libspdk_thread.so 00:04:09.659 CC lib/accel/accel.o 00:04:09.659 CC lib/accel/accel_rpc.o 
00:04:09.659 CC lib/accel/accel_sw.o 00:04:09.659 CC lib/blob/blobstore.o 00:04:09.659 CC lib/blob/zeroes.o 00:04:09.659 CC lib/blob/request.o 00:04:09.659 CC lib/init/json_config.o 00:04:09.659 CC lib/blob/blob_bs_dev.o 00:04:09.659 CC lib/init/rpc.o 00:04:09.659 CC lib/init/subsystem.o 00:04:09.659 CC lib/init/subsystem_rpc.o 00:04:09.659 CC lib/virtio/virtio.o 00:04:09.659 CC lib/virtio/virtio_vhost_user.o 00:04:09.659 CC lib/virtio/virtio_vfio_user.o 00:04:09.659 CC lib/virtio/virtio_pci.o 00:04:09.920 LIB libspdk_init.a 00:04:09.920 SO libspdk_init.so.5.0 00:04:10.181 LIB libspdk_virtio.a 00:04:10.181 SYMLINK libspdk_init.so 00:04:10.181 SO libspdk_virtio.so.7.0 00:04:10.181 SYMLINK libspdk_virtio.so 00:04:10.442 CC lib/event/app.o 00:04:10.442 CC lib/event/reactor.o 00:04:10.442 CC lib/event/log_rpc.o 00:04:10.442 CC lib/event/app_rpc.o 00:04:10.442 CC lib/event/scheduler_static.o 00:04:10.442 LIB libspdk_accel.a 00:04:10.703 SO libspdk_accel.so.15.1 00:04:10.703 SYMLINK libspdk_accel.so 00:04:10.703 LIB libspdk_event.a 00:04:10.964 SO libspdk_event.so.14.0 00:04:10.964 LIB libspdk_nvme.a 00:04:10.964 SYMLINK libspdk_event.so 00:04:10.964 CC lib/bdev/bdev.o 00:04:10.964 CC lib/bdev/bdev_rpc.o 00:04:10.964 CC lib/bdev/bdev_zone.o 00:04:10.964 CC lib/bdev/part.o 00:04:10.964 CC lib/bdev/scsi_nvme.o 00:04:10.964 SO libspdk_nvme.so.13.1 00:04:11.224 SYMLINK libspdk_nvme.so 00:04:12.165 LIB libspdk_blob.a 00:04:12.165 SO libspdk_blob.so.11.0 00:04:12.165 SYMLINK libspdk_blob.so 00:04:12.736 CC lib/lvol/lvol.o 00:04:12.736 CC lib/blobfs/blobfs.o 00:04:12.736 CC lib/blobfs/tree.o 00:04:13.306 LIB libspdk_lvol.a 00:04:13.566 SO libspdk_lvol.so.10.0 00:04:13.566 LIB libspdk_bdev.a 00:04:13.566 SO libspdk_bdev.so.15.1 00:04:13.566 SYMLINK libspdk_lvol.so 00:04:13.566 SYMLINK libspdk_bdev.so 00:04:13.854 CC lib/nvmf/ctrlr.o 00:04:13.854 CC lib/nvmf/ctrlr_discovery.o 00:04:13.854 CC lib/scsi/dev.o 00:04:13.854 CC lib/nvmf/ctrlr_bdev.o 00:04:13.854 CC lib/ftl/ftl_core.o 00:04:13.854 CC lib/nvmf/subsystem.o 00:04:13.854 CC lib/scsi/lun.o 00:04:13.854 CC lib/nvmf/nvmf.o 00:04:13.854 CC lib/ftl/ftl_init.o 00:04:13.854 CC lib/scsi/port.o 00:04:13.854 CC lib/nvmf/nvmf_rpc.o 00:04:13.854 CC lib/ftl/ftl_layout.o 00:04:13.854 CC lib/scsi/scsi.o 00:04:13.854 CC lib/nvmf/transport.o 00:04:13.854 CC lib/scsi/scsi_bdev.o 00:04:13.854 CC lib/ftl/ftl_debug.o 00:04:13.854 CC lib/nvmf/tcp.o 00:04:13.854 CC lib/scsi/scsi_pr.o 00:04:13.854 CC lib/nvmf/stubs.o 00:04:13.854 CC lib/ftl/ftl_io.o 00:04:13.854 CC lib/scsi/scsi_rpc.o 00:04:13.854 CC lib/nvmf/mdns_server.o 00:04:13.854 CC lib/scsi/task.o 00:04:13.854 CC lib/ftl/ftl_sb.o 00:04:13.854 CC lib/nbd/nbd.o 00:04:13.854 CC lib/nvmf/rdma.o 00:04:13.854 CC lib/ftl/ftl_l2p.o 00:04:13.854 CC lib/nvmf/auth.o 00:04:13.854 CC lib/nbd/nbd_rpc.o 00:04:13.854 CC lib/ublk/ublk.o 00:04:13.854 CC lib/ftl/ftl_l2p_flat.o 00:04:14.112 CC lib/ftl/ftl_nv_cache.o 00:04:14.112 CC lib/ublk/ublk_rpc.o 00:04:14.112 CC lib/ftl/ftl_band.o 00:04:14.112 CC lib/ftl/ftl_band_ops.o 00:04:14.112 CC lib/ftl/ftl_writer.o 00:04:14.112 CC lib/ftl/ftl_rq.o 00:04:14.112 CC lib/ftl/ftl_reloc.o 00:04:14.112 CC lib/ftl/ftl_l2p_cache.o 00:04:14.112 CC lib/ftl/ftl_p2l.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_md.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_startup.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_misc.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:04:14.112 CC 
lib/ftl/mngt/ftl_mngt_ioch.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_band.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:04:14.112 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:04:14.112 CC lib/ftl/utils/ftl_conf.o 00:04:14.112 CC lib/ftl/utils/ftl_md.o 00:04:14.112 CC lib/ftl/utils/ftl_mempool.o 00:04:14.112 CC lib/ftl/utils/ftl_property.o 00:04:14.112 CC lib/ftl/utils/ftl_bitmap.o 00:04:14.112 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:04:14.112 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:04:14.112 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:04:14.112 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:04:14.112 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:04:14.112 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:04:14.112 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:04:14.112 CC lib/ftl/upgrade/ftl_sb_v3.o 00:04:14.112 CC lib/ftl/upgrade/ftl_sb_v5.o 00:04:14.112 CC lib/ftl/nvc/ftl_nvc_dev.o 00:04:14.112 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:04:14.112 CC lib/ftl/base/ftl_base_dev.o 00:04:14.112 CC lib/ftl/base/ftl_base_bdev.o 00:04:14.112 CC lib/ftl/ftl_trace.o 00:04:14.682 LIB libspdk_blobfs.a 00:04:14.682 SO libspdk_blobfs.so.10.0 00:04:14.682 LIB libspdk_nbd.a 00:04:14.682 SO libspdk_nbd.so.7.0 00:04:14.682 SYMLINK libspdk_blobfs.so 00:04:14.682 SYMLINK libspdk_nbd.so 00:04:14.682 LIB libspdk_ublk.a 00:04:14.942 SO libspdk_ublk.so.3.0 00:04:14.942 LIB libspdk_scsi.a 00:04:14.942 SO libspdk_scsi.so.9.0 00:04:14.942 SYMLINK libspdk_ublk.so 00:04:15.202 SYMLINK libspdk_scsi.so 00:04:15.202 LIB libspdk_ftl.a 00:04:15.202 SO libspdk_ftl.so.9.0 00:04:15.462 CC lib/iscsi/conn.o 00:04:15.462 CC lib/iscsi/init_grp.o 00:04:15.462 CC lib/iscsi/iscsi.o 00:04:15.462 CC lib/iscsi/md5.o 00:04:15.462 CC lib/iscsi/param.o 00:04:15.462 CC lib/iscsi/portal_grp.o 00:04:15.462 CC lib/vhost/vhost.o 00:04:15.462 CC lib/iscsi/tgt_node.o 00:04:15.462 CC lib/vhost/vhost_rpc.o 00:04:15.462 CC lib/vhost/vhost_scsi.o 00:04:15.462 CC lib/iscsi/iscsi_subsystem.o 00:04:15.462 CC lib/vhost/vhost_blk.o 00:04:15.462 CC lib/iscsi/iscsi_rpc.o 00:04:15.462 CC lib/vhost/rte_vhost_user.o 00:04:15.462 CC lib/iscsi/task.o 00:04:15.723 SYMLINK libspdk_ftl.so 00:04:15.983 LIB libspdk_nvmf.a 00:04:15.983 SO libspdk_nvmf.so.18.1 00:04:16.245 SYMLINK libspdk_nvmf.so 00:04:16.506 LIB libspdk_vhost.a 00:04:16.506 SO libspdk_vhost.so.8.0 00:04:16.506 LIB libspdk_iscsi.a 00:04:16.506 SYMLINK libspdk_vhost.so 00:04:16.766 SO libspdk_iscsi.so.8.0 00:04:16.766 SYMLINK libspdk_iscsi.so 00:04:17.338 CC module/env_dpdk/env_dpdk_rpc.o 00:04:17.598 CC module/scheduler/dynamic/scheduler_dynamic.o 00:04:17.598 LIB libspdk_env_dpdk_rpc.a 00:04:17.598 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:04:17.598 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:04:17.598 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:04:17.598 CC module/scheduler/gscheduler/gscheduler.o 00:04:17.598 CC module/sock/posix/posix.o 00:04:17.598 CC module/accel/iaa/accel_iaa.o 00:04:17.598 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:04:17.598 CC module/accel/iaa/accel_iaa_rpc.o 00:04:17.598 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:04:17.598 CC module/keyring/linux/keyring.o 00:04:17.598 CC module/keyring/linux/keyring_rpc.o 00:04:17.598 CC module/accel/ioat/accel_ioat.o 00:04:17.598 CC module/accel/ioat/accel_ioat_rpc.o 00:04:17.598 CC module/blob/bdev/blob_bdev.o 00:04:17.598 CC module/keyring/file/keyring.o 00:04:17.598 CC 
module/keyring/file/keyring_rpc.o 00:04:17.598 CC module/accel/dsa/accel_dsa.o 00:04:17.598 CC module/accel/dsa/accel_dsa_rpc.o 00:04:17.598 CC module/accel/error/accel_error.o 00:04:17.598 CC module/accel/error/accel_error_rpc.o 00:04:17.598 SO libspdk_env_dpdk_rpc.so.6.0 00:04:17.598 SYMLINK libspdk_env_dpdk_rpc.so 00:04:17.859 LIB libspdk_scheduler_dpdk_governor.a 00:04:17.859 LIB libspdk_keyring_linux.a 00:04:17.859 LIB libspdk_scheduler_dynamic.a 00:04:17.859 LIB libspdk_keyring_file.a 00:04:17.859 SO libspdk_scheduler_dpdk_governor.so.4.0 00:04:17.859 LIB libspdk_accel_error.a 00:04:17.859 LIB libspdk_accel_ioat.a 00:04:17.859 SO libspdk_scheduler_dynamic.so.4.0 00:04:17.859 SO libspdk_keyring_linux.so.1.0 00:04:17.859 SO libspdk_keyring_file.so.1.0 00:04:17.859 LIB libspdk_accel_iaa.a 00:04:17.859 SYMLINK libspdk_scheduler_dpdk_governor.so 00:04:17.859 SO libspdk_accel_error.so.2.0 00:04:17.859 SO libspdk_accel_ioat.so.6.0 00:04:17.859 SO libspdk_accel_iaa.so.3.0 00:04:17.859 LIB libspdk_accel_dsa.a 00:04:17.859 LIB libspdk_blob_bdev.a 00:04:17.859 SYMLINK libspdk_keyring_linux.so 00:04:17.859 SYMLINK libspdk_scheduler_dynamic.so 00:04:17.859 SYMLINK libspdk_keyring_file.so 00:04:17.859 SYMLINK libspdk_accel_ioat.so 00:04:17.859 SYMLINK libspdk_accel_error.so 00:04:17.859 SO libspdk_blob_bdev.so.11.0 00:04:17.859 SO libspdk_accel_dsa.so.5.0 00:04:17.859 SYMLINK libspdk_accel_iaa.so 00:04:17.859 LIB libspdk_scheduler_gscheduler.a 00:04:17.859 SYMLINK libspdk_blob_bdev.so 00:04:17.859 SYMLINK libspdk_accel_dsa.so 00:04:17.859 SO libspdk_scheduler_gscheduler.so.4.0 00:04:18.120 SYMLINK libspdk_scheduler_gscheduler.so 00:04:18.120 LIB libspdk_sock_posix.a 00:04:18.381 SO libspdk_sock_posix.so.6.0 00:04:18.381 SYMLINK libspdk_sock_posix.so 00:04:18.381 LIB libspdk_accel_dpdk_compressdev.a 00:04:18.642 SO libspdk_accel_dpdk_compressdev.so.3.0 00:04:18.642 CC module/bdev/gpt/gpt.o 00:04:18.642 CC module/bdev/gpt/vbdev_gpt.o 00:04:18.642 CC module/bdev/lvol/vbdev_lvol.o 00:04:18.642 CC module/bdev/error/vbdev_error.o 00:04:18.642 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:04:18.642 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:04:18.642 CC module/blobfs/bdev/blobfs_bdev.o 00:04:18.642 CC module/bdev/error/vbdev_error_rpc.o 00:04:18.642 CC module/bdev/virtio/bdev_virtio_scsi.o 00:04:18.642 CC module/bdev/malloc/bdev_malloc.o 00:04:18.642 CC module/bdev/delay/vbdev_delay.o 00:04:18.642 CC module/bdev/delay/vbdev_delay_rpc.o 00:04:18.642 CC module/bdev/malloc/bdev_malloc_rpc.o 00:04:18.642 CC module/bdev/virtio/bdev_virtio_rpc.o 00:04:18.642 CC module/bdev/virtio/bdev_virtio_blk.o 00:04:18.642 CC module/bdev/passthru/vbdev_passthru.o 00:04:18.642 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:04:18.642 CC module/bdev/null/bdev_null.o 00:04:18.642 CC module/bdev/aio/bdev_aio.o 00:04:18.642 CC module/bdev/aio/bdev_aio_rpc.o 00:04:18.642 CC module/bdev/null/bdev_null_rpc.o 00:04:18.642 CC module/bdev/iscsi/bdev_iscsi.o 00:04:18.642 CC module/bdev/raid/bdev_raid.o 00:04:18.642 CC module/bdev/nvme/bdev_nvme.o 00:04:18.642 CC module/bdev/raid/bdev_raid_rpc.o 00:04:18.642 CC module/bdev/nvme/bdev_nvme_rpc.o 00:04:18.642 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:04:18.642 CC module/bdev/raid/bdev_raid_sb.o 00:04:18.642 CC module/bdev/nvme/nvme_rpc.o 00:04:18.642 CC module/bdev/raid/raid0.o 00:04:18.642 CC module/bdev/nvme/bdev_mdns_client.o 00:04:18.642 CC module/bdev/raid/raid1.o 00:04:18.642 CC module/bdev/nvme/vbdev_opal.o 00:04:18.642 CC module/bdev/nvme/vbdev_opal_rpc.o 00:04:18.642 CC 
module/bdev/raid/concat.o 00:04:18.642 CC module/bdev/zone_block/vbdev_zone_block.o 00:04:18.642 CC module/bdev/ftl/bdev_ftl.o 00:04:18.642 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:04:18.642 CC module/bdev/ftl/bdev_ftl_rpc.o 00:04:18.642 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:04:18.642 CC module/bdev/compress/vbdev_compress.o 00:04:18.642 CC module/bdev/crypto/vbdev_crypto.o 00:04:18.642 CC module/bdev/compress/vbdev_compress_rpc.o 00:04:18.642 CC module/bdev/split/vbdev_split_rpc.o 00:04:18.642 CC module/bdev/split/vbdev_split.o 00:04:18.642 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:04:18.642 SYMLINK libspdk_accel_dpdk_compressdev.so 00:04:18.642 LIB libspdk_accel_dpdk_cryptodev.a 00:04:18.908 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:04:18.908 LIB libspdk_blobfs_bdev.a 00:04:18.908 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:04:18.908 SO libspdk_blobfs_bdev.so.6.0 00:04:18.908 LIB libspdk_bdev_error.a 00:04:18.908 LIB libspdk_bdev_null.a 00:04:18.908 LIB libspdk_bdev_gpt.a 00:04:18.908 LIB libspdk_bdev_split.a 00:04:18.908 SO libspdk_bdev_error.so.6.0 00:04:18.908 SO libspdk_bdev_null.so.6.0 00:04:18.908 SO libspdk_bdev_gpt.so.6.0 00:04:18.908 LIB libspdk_bdev_passthru.a 00:04:18.908 SYMLINK libspdk_blobfs_bdev.so 00:04:18.908 SO libspdk_bdev_split.so.6.0 00:04:18.908 LIB libspdk_bdev_delay.a 00:04:18.908 LIB libspdk_bdev_ftl.a 00:04:18.908 LIB libspdk_bdev_crypto.a 00:04:18.908 SO libspdk_bdev_passthru.so.6.0 00:04:18.908 SYMLINK libspdk_bdev_error.so 00:04:18.908 SYMLINK libspdk_bdev_null.so 00:04:18.908 LIB libspdk_bdev_zone_block.a 00:04:18.908 LIB libspdk_bdev_iscsi.a 00:04:18.908 SYMLINK libspdk_bdev_gpt.so 00:04:18.908 SO libspdk_bdev_crypto.so.6.0 00:04:18.908 SO libspdk_bdev_ftl.so.6.0 00:04:18.908 LIB libspdk_bdev_aio.a 00:04:18.908 SO libspdk_bdev_delay.so.6.0 00:04:18.908 SYMLINK libspdk_bdev_split.so 00:04:19.169 LIB libspdk_bdev_compress.a 00:04:19.169 SO libspdk_bdev_zone_block.so.6.0 00:04:19.169 SO libspdk_bdev_iscsi.so.6.0 00:04:19.169 SYMLINK libspdk_bdev_passthru.so 00:04:19.169 SO libspdk_bdev_aio.so.6.0 00:04:19.169 SYMLINK libspdk_bdev_crypto.so 00:04:19.169 SO libspdk_bdev_compress.so.6.0 00:04:19.169 SYMLINK libspdk_bdev_ftl.so 00:04:19.169 SYMLINK libspdk_bdev_delay.so 00:04:19.169 SYMLINK libspdk_bdev_aio.so 00:04:19.169 LIB libspdk_bdev_lvol.a 00:04:19.169 SYMLINK libspdk_bdev_iscsi.so 00:04:19.169 LIB libspdk_bdev_virtio.a 00:04:19.169 SYMLINK libspdk_bdev_compress.so 00:04:19.169 SO libspdk_bdev_lvol.so.6.0 00:04:19.169 SYMLINK libspdk_bdev_zone_block.so 00:04:19.169 SO libspdk_bdev_virtio.so.6.0 00:04:19.169 SYMLINK libspdk_bdev_lvol.so 00:04:19.169 SYMLINK libspdk_bdev_virtio.so 00:04:19.430 LIB libspdk_bdev_malloc.a 00:04:19.430 LIB libspdk_bdev_raid.a 00:04:19.430 SO libspdk_bdev_malloc.so.6.0 00:04:19.430 SO libspdk_bdev_raid.so.6.0 00:04:19.430 SYMLINK libspdk_bdev_malloc.so 00:04:19.691 SYMLINK libspdk_bdev_raid.so 00:04:20.635 LIB libspdk_bdev_nvme.a 00:04:20.635 SO libspdk_bdev_nvme.so.7.0 00:04:20.635 SYMLINK libspdk_bdev_nvme.so 00:04:21.577 CC module/event/subsystems/vmd/vmd.o 00:04:21.577 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:21.577 CC module/event/subsystems/scheduler/scheduler.o 00:04:21.577 CC module/event/subsystems/iobuf/iobuf.o 00:04:21.577 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:21.577 CC module/event/subsystems/keyring/keyring.o 00:04:21.577 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:21.577 CC module/event/subsystems/sock/sock.o 00:04:21.577 LIB libspdk_event_iobuf.a 00:04:21.577 LIB 
libspdk_event_vmd.a 00:04:21.577 LIB libspdk_event_keyring.a 00:04:21.577 LIB libspdk_event_scheduler.a 00:04:21.577 LIB libspdk_event_vhost_blk.a 00:04:21.577 LIB libspdk_event_sock.a 00:04:21.577 SO libspdk_event_scheduler.so.4.0 00:04:21.578 SO libspdk_event_vmd.so.6.0 00:04:21.578 SO libspdk_event_iobuf.so.3.0 00:04:21.578 SO libspdk_event_keyring.so.1.0 00:04:21.578 SO libspdk_event_vhost_blk.so.3.0 00:04:21.578 SO libspdk_event_sock.so.5.0 00:04:21.578 SYMLINK libspdk_event_scheduler.so 00:04:21.578 SYMLINK libspdk_event_iobuf.so 00:04:21.578 SYMLINK libspdk_event_vmd.so 00:04:21.578 SYMLINK libspdk_event_keyring.so 00:04:21.578 SYMLINK libspdk_event_vhost_blk.so 00:04:21.578 SYMLINK libspdk_event_sock.so 00:04:22.150 CC module/event/subsystems/accel/accel.o 00:04:22.150 LIB libspdk_event_accel.a 00:04:22.150 SO libspdk_event_accel.so.6.0 00:04:22.150 SYMLINK libspdk_event_accel.so 00:04:22.722 CC module/event/subsystems/bdev/bdev.o 00:04:22.722 LIB libspdk_event_bdev.a 00:04:22.722 SO libspdk_event_bdev.so.6.0 00:04:22.982 SYMLINK libspdk_event_bdev.so 00:04:23.243 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:23.243 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:23.243 CC module/event/subsystems/nbd/nbd.o 00:04:23.243 CC module/event/subsystems/scsi/scsi.o 00:04:23.243 CC module/event/subsystems/ublk/ublk.o 00:04:23.505 LIB libspdk_event_ublk.a 00:04:23.505 LIB libspdk_event_nbd.a 00:04:23.505 LIB libspdk_event_scsi.a 00:04:23.505 SO libspdk_event_ublk.so.3.0 00:04:23.505 SO libspdk_event_nbd.so.6.0 00:04:23.505 LIB libspdk_event_nvmf.a 00:04:23.505 SO libspdk_event_scsi.so.6.0 00:04:23.505 SYMLINK libspdk_event_ublk.so 00:04:23.505 SO libspdk_event_nvmf.so.6.0 00:04:23.505 SYMLINK libspdk_event_nbd.so 00:04:23.505 SYMLINK libspdk_event_scsi.so 00:04:23.505 SYMLINK libspdk_event_nvmf.so 00:04:23.766 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:24.027 CC module/event/subsystems/iscsi/iscsi.o 00:04:24.027 LIB libspdk_event_vhost_scsi.a 00:04:24.027 LIB libspdk_event_iscsi.a 00:04:24.027 SO libspdk_event_vhost_scsi.so.3.0 00:04:24.027 SO libspdk_event_iscsi.so.6.0 00:04:24.287 SYMLINK libspdk_event_vhost_scsi.so 00:04:24.287 SYMLINK libspdk_event_iscsi.so 00:04:24.287 SO libspdk.so.6.0 00:04:24.287 SYMLINK libspdk.so 00:04:24.861 CC app/trace_record/trace_record.o 00:04:24.861 CXX app/trace/trace.o 00:04:24.862 CC app/spdk_nvme_identify/identify.o 00:04:24.862 CC app/spdk_top/spdk_top.o 00:04:24.862 CC app/spdk_nvme_discover/discovery_aer.o 00:04:24.862 CC app/spdk_lspci/spdk_lspci.o 00:04:24.862 TEST_HEADER include/spdk/accel.h 00:04:24.862 CC app/spdk_nvme_perf/perf.o 00:04:24.862 TEST_HEADER include/spdk/accel_module.h 00:04:24.862 TEST_HEADER include/spdk/assert.h 00:04:24.862 TEST_HEADER include/spdk/barrier.h 00:04:24.862 TEST_HEADER include/spdk/base64.h 00:04:24.862 TEST_HEADER include/spdk/bdev.h 00:04:24.862 TEST_HEADER include/spdk/bdev_module.h 00:04:24.862 TEST_HEADER include/spdk/bdev_zone.h 00:04:24.862 TEST_HEADER include/spdk/bit_array.h 00:04:24.862 TEST_HEADER include/spdk/bit_pool.h 00:04:24.862 TEST_HEADER include/spdk/blob_bdev.h 00:04:24.862 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:24.862 TEST_HEADER include/spdk/blobfs.h 00:04:24.862 CC test/rpc_client/rpc_client_test.o 00:04:24.862 TEST_HEADER include/spdk/blob.h 00:04:24.862 TEST_HEADER include/spdk/conf.h 00:04:24.862 TEST_HEADER include/spdk/config.h 00:04:24.862 TEST_HEADER include/spdk/cpuset.h 00:04:24.862 TEST_HEADER include/spdk/crc16.h 00:04:24.862 TEST_HEADER include/spdk/crc32.h 
00:04:24.862 TEST_HEADER include/spdk/crc64.h 00:04:24.862 TEST_HEADER include/spdk/dif.h 00:04:24.862 TEST_HEADER include/spdk/endian.h 00:04:24.862 TEST_HEADER include/spdk/dma.h 00:04:24.862 TEST_HEADER include/spdk/env_dpdk.h 00:04:24.862 TEST_HEADER include/spdk/env.h 00:04:24.862 TEST_HEADER include/spdk/event.h 00:04:24.862 TEST_HEADER include/spdk/fd_group.h 00:04:24.862 TEST_HEADER include/spdk/file.h 00:04:24.862 TEST_HEADER include/spdk/fd.h 00:04:24.862 TEST_HEADER include/spdk/ftl.h 00:04:24.862 TEST_HEADER include/spdk/gpt_spec.h 00:04:24.862 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:24.862 TEST_HEADER include/spdk/hexlify.h 00:04:24.862 TEST_HEADER include/spdk/histogram_data.h 00:04:24.862 TEST_HEADER include/spdk/idxd.h 00:04:24.862 TEST_HEADER include/spdk/init.h 00:04:24.862 TEST_HEADER include/spdk/idxd_spec.h 00:04:24.862 TEST_HEADER include/spdk/ioat_spec.h 00:04:24.862 TEST_HEADER include/spdk/ioat.h 00:04:24.862 TEST_HEADER include/spdk/iscsi_spec.h 00:04:24.862 CC app/iscsi_tgt/iscsi_tgt.o 00:04:24.862 TEST_HEADER include/spdk/json.h 00:04:24.862 TEST_HEADER include/spdk/jsonrpc.h 00:04:24.862 CC app/spdk_dd/spdk_dd.o 00:04:24.862 TEST_HEADER include/spdk/keyring.h 00:04:24.862 TEST_HEADER include/spdk/keyring_module.h 00:04:24.862 TEST_HEADER include/spdk/log.h 00:04:24.862 TEST_HEADER include/spdk/likely.h 00:04:24.862 TEST_HEADER include/spdk/lvol.h 00:04:24.862 TEST_HEADER include/spdk/memory.h 00:04:24.862 TEST_HEADER include/spdk/mmio.h 00:04:24.862 TEST_HEADER include/spdk/nbd.h 00:04:24.862 TEST_HEADER include/spdk/net.h 00:04:24.862 CC app/spdk_tgt/spdk_tgt.o 00:04:24.862 TEST_HEADER include/spdk/notify.h 00:04:24.862 TEST_HEADER include/spdk/nvme.h 00:04:24.862 TEST_HEADER include/spdk/nvme_intel.h 00:04:24.862 TEST_HEADER include/spdk/nvme_ocssd.h 00:04:24.862 TEST_HEADER include/spdk/nvme_spec.h 00:04:24.862 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:24.862 TEST_HEADER include/spdk/nvme_zns.h 00:04:24.862 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:24.862 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:24.862 TEST_HEADER include/spdk/nvmf_spec.h 00:04:24.862 TEST_HEADER include/spdk/nvmf.h 00:04:24.862 TEST_HEADER include/spdk/nvmf_transport.h 00:04:24.862 TEST_HEADER include/spdk/opal_spec.h 00:04:24.862 TEST_HEADER include/spdk/opal.h 00:04:24.862 TEST_HEADER include/spdk/pipe.h 00:04:24.862 TEST_HEADER include/spdk/queue.h 00:04:24.862 TEST_HEADER include/spdk/pci_ids.h 00:04:24.862 TEST_HEADER include/spdk/reduce.h 00:04:24.862 TEST_HEADER include/spdk/rpc.h 00:04:24.862 TEST_HEADER include/spdk/scsi.h 00:04:24.862 TEST_HEADER include/spdk/scheduler.h 00:04:24.862 TEST_HEADER include/spdk/scsi_spec.h 00:04:24.862 TEST_HEADER include/spdk/sock.h 00:04:24.862 TEST_HEADER include/spdk/stdinc.h 00:04:24.862 TEST_HEADER include/spdk/thread.h 00:04:24.862 TEST_HEADER include/spdk/string.h 00:04:24.862 TEST_HEADER include/spdk/trace.h 00:04:24.862 TEST_HEADER include/spdk/tree.h 00:04:24.862 TEST_HEADER include/spdk/trace_parser.h 00:04:24.862 TEST_HEADER include/spdk/ublk.h 00:04:24.862 TEST_HEADER include/spdk/uuid.h 00:04:24.862 TEST_HEADER include/spdk/util.h 00:04:24.862 TEST_HEADER include/spdk/version.h 00:04:24.862 TEST_HEADER include/spdk/vfio_user_spec.h 00:04:24.862 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:24.862 TEST_HEADER include/spdk/vhost.h 00:04:24.862 TEST_HEADER include/spdk/vmd.h 00:04:24.862 TEST_HEADER include/spdk/xor.h 00:04:24.862 TEST_HEADER include/spdk/zipf.h 00:04:24.862 CXX test/cpp_headers/accel.o 00:04:24.862 
CXX test/cpp_headers/accel_module.o 00:04:24.862 CXX test/cpp_headers/assert.o 00:04:24.862 CXX test/cpp_headers/barrier.o 00:04:24.862 CXX test/cpp_headers/base64.o 00:04:24.862 CXX test/cpp_headers/bdev.o 00:04:24.862 CXX test/cpp_headers/bdev_zone.o 00:04:24.862 CXX test/cpp_headers/bdev_module.o 00:04:24.862 CXX test/cpp_headers/bit_array.o 00:04:24.862 CXX test/cpp_headers/bit_pool.o 00:04:24.862 CXX test/cpp_headers/blob_bdev.o 00:04:24.862 CXX test/cpp_headers/blobfs.o 00:04:24.862 CC app/nvmf_tgt/nvmf_main.o 00:04:24.862 CXX test/cpp_headers/blob.o 00:04:24.862 CXX test/cpp_headers/blobfs_bdev.o 00:04:24.862 CXX test/cpp_headers/conf.o 00:04:24.862 CXX test/cpp_headers/config.o 00:04:24.862 CXX test/cpp_headers/cpuset.o 00:04:24.862 CXX test/cpp_headers/crc16.o 00:04:24.862 CXX test/cpp_headers/crc64.o 00:04:24.862 CXX test/cpp_headers/crc32.o 00:04:24.862 CXX test/cpp_headers/dif.o 00:04:24.862 CXX test/cpp_headers/dma.o 00:04:24.862 CXX test/cpp_headers/endian.o 00:04:24.862 CXX test/cpp_headers/env_dpdk.o 00:04:24.862 CXX test/cpp_headers/env.o 00:04:24.862 CXX test/cpp_headers/event.o 00:04:24.862 CXX test/cpp_headers/fd.o 00:04:24.862 CXX test/cpp_headers/fd_group.o 00:04:24.862 CXX test/cpp_headers/file.o 00:04:24.862 CXX test/cpp_headers/gpt_spec.o 00:04:24.862 CXX test/cpp_headers/hexlify.o 00:04:24.862 CXX test/cpp_headers/ftl.o 00:04:24.862 CXX test/cpp_headers/histogram_data.o 00:04:24.862 CXX test/cpp_headers/idxd.o 00:04:24.862 CXX test/cpp_headers/idxd_spec.o 00:04:24.862 CXX test/cpp_headers/init.o 00:04:24.862 CXX test/cpp_headers/ioat.o 00:04:24.862 CXX test/cpp_headers/ioat_spec.o 00:04:24.862 CXX test/cpp_headers/jsonrpc.o 00:04:24.862 CXX test/cpp_headers/iscsi_spec.o 00:04:24.862 CXX test/cpp_headers/json.o 00:04:24.862 CXX test/cpp_headers/keyring_module.o 00:04:25.130 CXX test/cpp_headers/lvol.o 00:04:25.130 CXX test/cpp_headers/keyring.o 00:04:25.130 CXX test/cpp_headers/likely.o 00:04:25.130 CXX test/cpp_headers/memory.o 00:04:25.130 CXX test/cpp_headers/log.o 00:04:25.130 CXX test/cpp_headers/net.o 00:04:25.130 CXX test/cpp_headers/mmio.o 00:04:25.130 CXX test/cpp_headers/nvme.o 00:04:25.130 CXX test/cpp_headers/nbd.o 00:04:25.130 CC examples/ioat/verify/verify.o 00:04:25.130 CXX test/cpp_headers/notify.o 00:04:25.130 CXX test/cpp_headers/nvme_intel.o 00:04:25.130 CXX test/cpp_headers/nvme_ocssd.o 00:04:25.130 CXX test/cpp_headers/nvme_zns.o 00:04:25.130 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:25.130 CXX test/cpp_headers/nvme_spec.o 00:04:25.130 CXX test/cpp_headers/nvmf_cmd.o 00:04:25.130 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:25.130 CXX test/cpp_headers/nvmf_spec.o 00:04:25.130 CXX test/cpp_headers/nvmf_transport.o 00:04:25.130 CXX test/cpp_headers/nvmf.o 00:04:25.130 CXX test/cpp_headers/opal.o 00:04:25.130 CXX test/cpp_headers/pci_ids.o 00:04:25.130 CXX test/cpp_headers/opal_spec.o 00:04:25.130 CXX test/cpp_headers/reduce.o 00:04:25.130 CXX test/cpp_headers/pipe.o 00:04:25.130 CXX test/cpp_headers/queue.o 00:04:25.130 CXX test/cpp_headers/scsi.o 00:04:25.130 CXX test/cpp_headers/rpc.o 00:04:25.130 CC test/app/histogram_perf/histogram_perf.o 00:04:25.130 CXX test/cpp_headers/scsi_spec.o 00:04:25.130 CXX test/cpp_headers/scheduler.o 00:04:25.130 CXX test/cpp_headers/sock.o 00:04:25.130 CXX test/cpp_headers/stdinc.o 00:04:25.130 CXX test/cpp_headers/thread.o 00:04:25.130 CXX test/cpp_headers/trace_parser.o 00:04:25.130 CXX test/cpp_headers/string.o 00:04:25.130 CC test/app/jsoncat/jsoncat.o 00:04:25.130 CXX test/cpp_headers/ublk.o 00:04:25.130 CXX 
test/cpp_headers/trace.o 00:04:25.130 LINK spdk_lspci 00:04:25.130 CXX test/cpp_headers/tree.o 00:04:25.130 CXX test/cpp_headers/uuid.o 00:04:25.130 CXX test/cpp_headers/version.o 00:04:25.130 CXX test/cpp_headers/util.o 00:04:25.130 CXX test/cpp_headers/vfio_user_pci.o 00:04:25.130 CC test/thread/poller_perf/poller_perf.o 00:04:25.130 CXX test/cpp_headers/vfio_user_spec.o 00:04:25.130 CXX test/cpp_headers/vmd.o 00:04:25.130 CXX test/cpp_headers/vhost.o 00:04:25.130 CXX test/cpp_headers/xor.o 00:04:25.130 CC test/app/stub/stub.o 00:04:25.130 CXX test/cpp_headers/zipf.o 00:04:25.130 CC examples/ioat/perf/perf.o 00:04:25.130 CC app/fio/nvme/fio_plugin.o 00:04:25.130 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:25.130 CC test/env/vtophys/vtophys.o 00:04:25.130 CC test/app/bdev_svc/bdev_svc.o 00:04:25.130 CC test/env/memory/memory_ut.o 00:04:25.131 CC test/env/pci/pci_ut.o 00:04:25.131 CC test/dma/test_dma/test_dma.o 00:04:25.131 CC examples/util/zipf/zipf.o 00:04:25.131 CC app/fio/bdev/fio_plugin.o 00:04:25.131 LINK spdk_nvme_discover 00:04:25.131 LINK rpc_client_test 00:04:25.393 LINK spdk_trace_record 00:04:25.393 LINK interrupt_tgt 00:04:25.393 LINK iscsi_tgt 00:04:25.393 LINK spdk_tgt 00:04:25.656 LINK spdk_trace 00:04:25.656 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:25.656 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:25.656 LINK verify 00:04:25.656 CC test/env/mem_callbacks/mem_callbacks.o 00:04:25.656 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:25.656 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:25.656 LINK bdev_svc 00:04:25.917 LINK nvmf_tgt 00:04:25.917 LINK vtophys 00:04:25.917 LINK jsoncat 00:04:25.917 LINK histogram_perf 00:04:25.917 LINK ioat_perf 00:04:25.917 LINK spdk_dd 00:04:25.917 LINK poller_perf 00:04:25.917 LINK stub 00:04:25.917 LINK env_dpdk_post_init 00:04:26.177 LINK zipf 00:04:26.177 LINK spdk_nvme_perf 00:04:26.177 CC app/vhost/vhost.o 00:04:26.438 LINK test_dma 00:04:26.438 LINK nvme_fuzz 00:04:26.438 LINK pci_ut 00:04:26.438 LINK vhost 00:04:26.438 LINK spdk_bdev 00:04:26.438 LINK vhost_fuzz 00:04:26.438 LINK spdk_nvme 00:04:26.698 LINK spdk_top 00:04:26.698 CC test/event/reactor_perf/reactor_perf.o 00:04:26.698 CC test/event/reactor/reactor.o 00:04:26.698 CC test/event/event_perf/event_perf.o 00:04:26.698 CC test/event/app_repeat/app_repeat.o 00:04:26.698 LINK mem_callbacks 00:04:26.698 CC examples/idxd/perf/perf.o 00:04:26.698 CC examples/sock/hello_world/hello_sock.o 00:04:26.698 CC test/event/scheduler/scheduler.o 00:04:26.698 CC examples/vmd/led/led.o 00:04:26.698 CC examples/vmd/lsvmd/lsvmd.o 00:04:26.698 LINK spdk_nvme_identify 00:04:26.698 CC examples/thread/thread/thread_ex.o 00:04:26.698 LINK reactor_perf 00:04:26.698 LINK reactor 00:04:26.698 LINK event_perf 00:04:26.958 LINK led 00:04:26.958 LINK app_repeat 00:04:26.958 LINK lsvmd 00:04:26.958 LINK hello_sock 00:04:26.958 CC test/nvme/boot_partition/boot_partition.o 00:04:26.958 CC test/nvme/aer/aer.o 00:04:26.958 CC test/nvme/overhead/overhead.o 00:04:26.958 CC test/nvme/startup/startup.o 00:04:26.958 CC test/nvme/sgl/sgl.o 00:04:26.958 CC test/nvme/fused_ordering/fused_ordering.o 00:04:26.958 CC test/nvme/e2edp/nvme_dp.o 00:04:26.958 LINK scheduler 00:04:26.958 CC test/nvme/simple_copy/simple_copy.o 00:04:26.958 CC test/nvme/err_injection/err_injection.o 00:04:26.958 CC test/nvme/compliance/nvme_compliance.o 00:04:26.958 CC test/nvme/cuse/cuse.o 00:04:26.958 CC test/nvme/fdp/fdp.o 00:04:26.958 CC test/nvme/reserve/reserve.o 00:04:26.958 CC 
test/nvme/doorbell_aers/doorbell_aers.o 00:04:26.959 CC test/accel/dif/dif.o 00:04:26.959 LINK idxd_perf 00:04:26.959 CC test/nvme/reset/reset.o 00:04:26.959 CC test/nvme/connect_stress/connect_stress.o 00:04:26.959 CC test/blobfs/mkfs/mkfs.o 00:04:26.959 LINK thread 00:04:26.959 LINK memory_ut 00:04:26.959 CC test/lvol/esnap/esnap.o 00:04:27.218 LINK boot_partition 00:04:27.218 LINK err_injection 00:04:27.218 LINK startup 00:04:27.218 LINK reserve 00:04:27.218 LINK fused_ordering 00:04:27.218 LINK doorbell_aers 00:04:27.218 LINK simple_copy 00:04:27.218 LINK mkfs 00:04:27.218 LINK aer 00:04:27.218 LINK overhead 00:04:27.218 LINK sgl 00:04:27.218 LINK reset 00:04:27.218 LINK nvme_compliance 00:04:27.479 LINK fdp 00:04:27.479 LINK iscsi_fuzz 00:04:27.479 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:27.479 CC examples/nvme/hello_world/hello_world.o 00:04:27.479 CC examples/nvme/hotplug/hotplug.o 00:04:27.479 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:27.479 CC examples/nvme/arbitration/arbitration.o 00:04:27.479 CC examples/nvme/reconnect/reconnect.o 00:04:27.479 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:27.479 LINK dif 00:04:27.479 CC examples/nvme/abort/abort.o 00:04:27.479 LINK connect_stress 00:04:27.740 LINK cmb_copy 00:04:27.740 LINK nvme_dp 00:04:27.740 LINK pmr_persistence 00:04:27.740 CC examples/accel/perf/accel_perf.o 00:04:27.740 CC examples/blob/cli/blobcli.o 00:04:27.740 LINK hello_world 00:04:27.740 CC examples/blob/hello_world/hello_blob.o 00:04:27.740 LINK arbitration 00:04:27.740 LINK reconnect 00:04:27.740 LINK abort 00:04:28.001 LINK nvme_manage 00:04:28.001 LINK hello_blob 00:04:28.001 LINK hotplug 00:04:28.001 LINK accel_perf 00:04:28.001 CC test/bdev/bdevio/bdevio.o 00:04:28.261 LINK cuse 00:04:28.261 LINK blobcli 00:04:28.833 CC examples/bdev/hello_world/hello_bdev.o 00:04:28.834 CC examples/bdev/bdevperf/bdevperf.o 00:04:29.094 LINK bdevio 00:04:29.094 LINK hello_bdev 00:04:29.355 LINK bdevperf 00:04:29.927 CC examples/nvmf/nvmf/nvmf.o 00:04:30.499 LINK nvmf 00:04:31.119 LINK esnap 00:04:31.692 00:04:31.692 real 1m19.586s 00:04:31.692 user 13m56.692s 00:04:31.692 sys 7m0.846s 00:04:31.692 15:40:51 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:04:31.692 15:40:51 make -- common/autotest_common.sh@10 -- $ set +x 00:04:31.692 ************************************ 00:04:31.692 END TEST make 00:04:31.692 ************************************ 00:04:31.692 15:40:51 -- common/autotest_common.sh@1142 -- $ return 0 00:04:31.692 15:40:51 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:31.692 15:40:51 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:31.692 15:40:51 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:31.692 15:40:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:31.692 15:40:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:04:31.692 15:40:51 -- pm/common@44 -- $ pid=2322395 00:04:31.692 15:40:51 -- pm/common@50 -- $ kill -TERM 2322395 00:04:31.692 15:40:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:31.692 15:40:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:31.692 15:40:51 -- pm/common@44 -- $ pid=2322396 00:04:31.692 15:40:51 -- pm/common@50 -- $ kill -TERM 2322396 00:04:31.692 15:40:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:31.692 15:40:51 -- pm/common@43 -- $ [[ -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:31.692 15:40:51 -- pm/common@44 -- $ pid=2322398 00:04:31.692 15:40:51 -- pm/common@50 -- $ kill -TERM 2322398 00:04:31.692 15:40:51 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:31.692 15:40:51 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:31.692 15:40:51 -- pm/common@44 -- $ pid=2322420 00:04:31.692 15:40:51 -- pm/common@50 -- $ sudo -E kill -TERM 2322420 00:04:31.692 15:40:52 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:04:31.692 15:40:52 -- nvmf/common.sh@7 -- # uname -s 00:04:31.692 15:40:52 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:31.692 15:40:52 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:31.692 15:40:52 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:31.692 15:40:52 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:31.692 15:40:52 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:31.692 15:40:52 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:31.692 15:40:52 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:31.692 15:40:52 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:31.692 15:40:52 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:31.692 15:40:52 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:31.692 15:40:52 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:04:31.692 15:40:52 -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:04:31.692 15:40:52 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:31.692 15:40:52 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:31.692 15:40:52 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:31.692 15:40:52 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:31.692 15:40:52 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:04:31.692 15:40:52 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:31.692 15:40:52 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:31.692 15:40:52 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:31.692 15:40:52 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:31.692 15:40:52 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:31.692 15:40:52 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:31.692 15:40:52 -- paths/export.sh@5 -- # export PATH 00:04:31.692 15:40:52 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:31.692 15:40:52 -- nvmf/common.sh@47 -- # : 0 00:04:31.692 15:40:52 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:04:31.692 15:40:52 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:04:31.692 15:40:52 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:31.692 15:40:52 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:31.692 15:40:52 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:31.692 15:40:52 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:04:31.692 15:40:52 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:04:31.692 15:40:52 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:04:31.692 15:40:52 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:31.692 15:40:52 -- spdk/autotest.sh@32 -- # uname -s 00:04:31.692 15:40:52 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:31.692 15:40:52 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:31.692 15:40:52 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:04:31.692 15:40:52 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:31.692 15:40:52 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:04:31.692 15:40:52 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:31.692 15:40:52 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:31.692 15:40:52 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:31.692 15:40:52 -- spdk/autotest.sh@48 -- # udevadm_pid=2391214 00:04:31.692 15:40:52 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:31.692 15:40:52 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:31.692 15:40:52 -- pm/common@17 -- # local monitor 00:04:31.692 15:40:52 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:31.692 15:40:52 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:31.692 15:40:52 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:31.692 15:40:52 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:31.692 15:40:52 -- pm/common@21 -- # date +%s 00:04:31.692 15:40:52 -- pm/common@25 -- # sleep 1 00:04:31.692 15:40:52 -- pm/common@21 -- # date +%s 00:04:31.692 15:40:52 -- pm/common@21 -- # date +%s 00:04:31.692 15:40:52 -- pm/common@21 -- # date +%s 00:04:31.692 15:40:52 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720791652 00:04:31.692 15:40:52 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720791652 00:04:31.692 15:40:52 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720791652 00:04:31.692 15:40:52 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1720791652 00:04:31.692 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720791652_collect-vmstat.pm.log 00:04:31.692 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720791652_collect-cpu-load.pm.log 00:04:31.692 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720791652_collect-cpu-temp.pm.log 00:04:31.953 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1720791652_collect-bmc-pm.bmc.pm.log 00:04:32.895 15:40:53 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:32.895 15:40:53 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:32.895 15:40:53 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:32.895 15:40:53 -- common/autotest_common.sh@10 -- # set +x 00:04:32.895 15:40:53 -- spdk/autotest.sh@59 -- # create_test_list 00:04:32.895 15:40:53 -- common/autotest_common.sh@746 -- # xtrace_disable 00:04:32.895 15:40:53 -- common/autotest_common.sh@10 -- # set +x 00:04:32.895 15:40:53 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:04:32.895 15:40:53 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:32.895 15:40:53 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:32.895 15:40:53 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:04:32.895 15:40:53 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:04:32.895 15:40:53 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:32.895 15:40:53 -- common/autotest_common.sh@1455 -- # uname 00:04:32.895 15:40:53 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:04:32.895 15:40:53 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:32.895 15:40:53 -- common/autotest_common.sh@1475 -- # uname 00:04:32.895 15:40:53 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:04:32.895 15:40:53 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:04:32.895 15:40:53 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:04:32.895 15:40:53 -- spdk/autotest.sh@72 -- # hash lcov 00:04:32.895 15:40:53 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:04:32.895 15:40:53 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:04:32.895 --rc lcov_branch_coverage=1 00:04:32.895 --rc lcov_function_coverage=1 00:04:32.895 --rc genhtml_branch_coverage=1 00:04:32.895 --rc genhtml_function_coverage=1 00:04:32.895 --rc genhtml_legend=1 00:04:32.895 --rc geninfo_all_blocks=1 00:04:32.895 ' 00:04:32.895 15:40:53 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:04:32.895 --rc lcov_branch_coverage=1 00:04:32.895 --rc lcov_function_coverage=1 00:04:32.895 --rc genhtml_branch_coverage=1 00:04:32.895 --rc genhtml_function_coverage=1 00:04:32.895 --rc genhtml_legend=1 00:04:32.895 --rc geninfo_all_blocks=1 00:04:32.895 ' 00:04:32.895 15:40:53 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:04:32.895 --rc lcov_branch_coverage=1 00:04:32.895 --rc lcov_function_coverage=1 00:04:32.895 --rc genhtml_branch_coverage=1 00:04:32.895 --rc genhtml_function_coverage=1 00:04:32.895 --rc genhtml_legend=1 00:04:32.895 --rc geninfo_all_blocks=1 00:04:32.895 --no-external' 00:04:32.895 15:40:53 -- spdk/autotest.sh@81 -- # 
LCOV='lcov 00:04:32.895 --rc lcov_branch_coverage=1 00:04:32.895 --rc lcov_function_coverage=1 00:04:32.895 --rc genhtml_branch_coverage=1 00:04:32.895 --rc genhtml_function_coverage=1 00:04:32.895 --rc genhtml_legend=1 00:04:32.895 --rc geninfo_all_blocks=1 00:04:32.895 --no-external' 00:04:32.896 15:40:53 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:04:32.896 lcov: LCOV version 1.14 00:04:32.896 15:40:53 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:04:34.279 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:04:34.279 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:04:34.279 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:04:34.540 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:04:34.540 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:04:34.540 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:04:34.540 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:04:34.801 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 
00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:04:34.801 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:04:34.801 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:04:35.063 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:04:35.063 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:04:35.063 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:04:47.289 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:04:47.289 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:05:02.196 15:41:21 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:05:02.196 15:41:21 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:02.196 15:41:21 -- common/autotest_common.sh@10 -- # set +x 00:05:02.196 15:41:21 -- spdk/autotest.sh@91 -- # rm -f 00:05:02.196 15:41:21 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:04.744 0000:80:01.6 (8086 0b00): Already using the ioatdma driver 00:05:04.744 0000:80:01.7 (8086 0b00): Already using the ioatdma driver 00:05:04.744 0000:80:01.4 (8086 0b00): Already using the ioatdma driver 00:05:04.744 0000:80:01.5 (8086 0b00): Already using the ioatdma driver 00:05:04.744 0000:80:01.2 (8086 0b00): Already using the ioatdma driver 00:05:04.744 0000:80:01.3 (8086 0b00): Already using the ioatdma driver 00:05:04.744 0000:80:01.0 (8086 0b00): Already using the ioatdma driver 00:05:04.744 0000:80:01.1 (8086 0b00): Already using the ioatdma driver 00:05:04.744 0000:65:00.0 (8086 0a54): Already using the nvme driver 00:05:04.744 0000:00:01.6 (8086 
0b00): Already using the ioatdma driver 00:05:05.004 0000:00:01.7 (8086 0b00): Already using the ioatdma driver 00:05:05.004 0000:00:01.4 (8086 0b00): Already using the ioatdma driver 00:05:05.004 0000:00:01.5 (8086 0b00): Already using the ioatdma driver 00:05:05.004 0000:00:01.2 (8086 0b00): Already using the ioatdma driver 00:05:05.004 0000:00:01.3 (8086 0b00): Already using the ioatdma driver 00:05:05.004 0000:00:01.0 (8086 0b00): Already using the ioatdma driver 00:05:05.004 0000:00:01.1 (8086 0b00): Already using the ioatdma driver 00:05:05.004 15:41:25 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:05:05.004 15:41:25 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:05.004 15:41:25 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:05.004 15:41:25 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:05.004 15:41:25 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:05.004 15:41:25 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:05.004 15:41:25 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:05.004 15:41:25 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:05.004 15:41:25 -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:05.004 15:41:25 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:05:05.004 15:41:25 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:05:05.004 15:41:25 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:05:05.004 15:41:25 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:05:05.004 15:41:25 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:05:05.004 15:41:25 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:05:05.004 No valid GPT data, bailing 00:05:05.004 15:41:25 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:05.004 15:41:25 -- scripts/common.sh@391 -- # pt= 00:05:05.004 15:41:25 -- scripts/common.sh@392 -- # return 1 00:05:05.004 15:41:25 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:05:05.264 1+0 records in 00:05:05.264 1+0 records out 00:05:05.264 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00512165 s, 205 MB/s 00:05:05.264 15:41:25 -- spdk/autotest.sh@118 -- # sync 00:05:05.264 15:41:25 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:05:05.264 15:41:25 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:05:05.264 15:41:25 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:05:13.396 15:41:32 -- spdk/autotest.sh@124 -- # uname -s 00:05:13.396 15:41:32 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:05:13.396 15:41:32 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:05:13.396 15:41:32 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:13.396 15:41:32 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:13.396 15:41:32 -- common/autotest_common.sh@10 -- # set +x 00:05:13.396 ************************************ 00:05:13.396 START TEST setup.sh 00:05:13.396 ************************************ 00:05:13.396 15:41:32 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:05:13.396 * Looking for test storage... 
00:05:13.396 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:13.396 15:41:32 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:05:13.396 15:41:32 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:05:13.396 15:41:32 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:05:13.396 15:41:32 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:13.396 15:41:32 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:13.396 15:41:32 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:13.396 ************************************ 00:05:13.396 START TEST acl 00:05:13.396 ************************************ 00:05:13.396 15:41:32 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:05:13.396 * Looking for test storage... 00:05:13.396 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:13.396 15:41:33 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:05:13.396 15:41:33 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:13.396 15:41:33 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:13.396 15:41:33 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:13.396 15:41:33 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:13.396 15:41:33 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:13.396 15:41:33 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:13.396 15:41:33 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:13.396 15:41:33 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:13.396 15:41:33 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:05:13.396 15:41:33 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:05:13.396 15:41:33 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:05:13.396 15:41:33 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:05:13.396 15:41:33 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:05:13.396 15:41:33 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:13.396 15:41:33 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:17.677 15:41:37 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:05:17.677 15:41:37 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:05:17.677 15:41:37 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:17.677 15:41:37 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:05:17.677 15:41:37 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:05:17.677 15:41:37 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:05:20.977 Hugepages 00:05:20.977 node hugesize free / total 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 00:05:20.977 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.0 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.1 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.2 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.3 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.4 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.5 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.6 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.7 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:65:00.0 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:05:20.977 15:41:41 setup.sh.acl -- 
setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.0 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.1 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.2 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.3 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.4 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.5 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.6 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.7 == *:*:*.* ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:05:20.977 15:41:41 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:05:20.977 15:41:41 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:20.977 15:41:41 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:20.977 15:41:41 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:21.237 ************************************ 00:05:21.237 START TEST denied 00:05:21.237 ************************************ 00:05:21.237 15:41:41 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:05:21.237 15:41:41 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:65:00.0' 00:05:21.237 15:41:41 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 
00:05:21.237 15:41:41 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:65:00.0' 00:05:21.237 15:41:41 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:05:21.237 15:41:41 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:25.440 0000:65:00.0 (8086 0a54): Skipping denied controller at 0000:65:00.0 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:65:00.0 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:65:00.0 ]] 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:65:00.0/driver 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:25.440 15:41:45 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:30.723 00:05:30.723 real 0m9.107s 00:05:30.723 user 0m3.006s 00:05:30.723 sys 0m5.343s 00:05:30.723 15:41:50 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:30.723 15:41:50 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:05:30.723 ************************************ 00:05:30.723 END TEST denied 00:05:30.723 ************************************ 00:05:30.723 15:41:50 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:05:30.723 15:41:50 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:05:30.723 15:41:50 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:30.723 15:41:50 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:30.723 15:41:50 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:30.723 ************************************ 00:05:30.723 START TEST allowed 00:05:30.723 ************************************ 00:05:30.723 15:41:50 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:05:30.723 15:41:50 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:65:00.0 00:05:30.723 15:41:50 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:05:30.723 15:41:50 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:65:00.0 .*: nvme -> .*' 00:05:30.723 15:41:50 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:05:30.723 15:41:50 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:37.303 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:05:37.303 15:41:56 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:05:37.303 15:41:56 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:05:37.303 15:41:56 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:05:37.303 15:41:56 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:37.303 15:41:56 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:40.600 00:05:40.600 real 0m10.228s 00:05:40.600 
user 0m3.033s 00:05:40.600 sys 0m5.472s 00:05:40.600 15:42:00 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:40.600 15:42:00 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:05:40.600 ************************************ 00:05:40.600 END TEST allowed 00:05:40.600 ************************************ 00:05:40.600 15:42:00 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:05:40.600 00:05:40.600 real 0m27.962s 00:05:40.600 user 0m9.232s 00:05:40.600 sys 0m16.479s 00:05:40.600 15:42:00 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:40.600 15:42:00 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:05:40.600 ************************************ 00:05:40.600 END TEST acl 00:05:40.600 ************************************ 00:05:40.600 15:42:00 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:40.600 15:42:00 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:05:40.600 15:42:00 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:40.600 15:42:00 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:40.600 15:42:00 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:40.600 ************************************ 00:05:40.600 START TEST hugepages 00:05:40.600 ************************************ 00:05:40.600 15:42:00 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:05:40.861 * Looking for test storage... 00:05:40.861 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 108538848 kB' 'MemAvailable: 111669236 kB' 'Buffers: 11424 kB' 'Cached: 9239484 kB' 'SwapCached: 0 kB' 'Active: 6314500 kB' 'Inactive: 3442756 kB' 'Active(anon): 5924008 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 
'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 509804 kB' 'Mapped: 178272 kB' 'Shmem: 5417660 kB' 'KReclaimable: 242964 kB' 'Slab: 833320 kB' 'SReclaimable: 242964 kB' 'SUnreclaim: 590356 kB' 'KernelStack: 25200 kB' 'PageTables: 9008 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69463468 kB' 'Committed_AS: 7467896 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226840 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.861 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 
setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.862 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:05:40.863 
15:42:01 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:40.863 15:42:01 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:05:40.863 15:42:01 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:40.863 15:42:01 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:40.863 15:42:01 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:40.863 ************************************ 00:05:40.863 START TEST default_setup 00:05:40.863 ************************************ 00:05:40.863 15:42:01 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:05:40.863 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:05:40.863 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:05:40.863 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:40.863 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:40.864 15:42:01 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:05:40.864 15:42:01 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:45.108 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:05:45.108 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:05:47.023 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.023 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110722660 kB' 'MemAvailable: 113852592 kB' 'Buffers: 11424 kB' 'Cached: 9239632 kB' 'SwapCached: 0 kB' 'Active: 6330272 kB' 'Inactive: 3442756 kB' 'Active(anon): 5939780 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525428 kB' 'Mapped: 178640 kB' 'Shmem: 5417808 kB' 'KReclaimable: 242052 kB' 'Slab: 831264 kB' 'SReclaimable: 242052 kB' 'SUnreclaim: 589212 kB' 'KernelStack: 25056 kB' 'PageTables: 8656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7486692 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226888 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.023 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110724824 kB' 'MemAvailable: 113854756 kB' 'Buffers: 11424 kB' 'Cached: 9239632 kB' 'SwapCached: 0 kB' 'Active: 6329872 kB' 'Inactive: 3442756 kB' 'Active(anon): 5939380 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525064 kB' 'Mapped: 178576 kB' 'Shmem: 5417808 kB' 'KReclaimable: 242052 kB' 'Slab: 831232 kB' 'SReclaimable: 242052 kB' 'SUnreclaim: 589180 kB' 'KernelStack: 25040 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7486708 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226840 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.024 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 
15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@99 -- # surp=0 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110724160 kB' 'MemAvailable: 113854092 kB' 'Buffers: 11424 kB' 'Cached: 9239652 kB' 'SwapCached: 0 kB' 'Active: 6329828 kB' 'Inactive: 3442756 kB' 'Active(anon): 5939336 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524968 kB' 'Mapped: 178516 kB' 'Shmem: 5417828 kB' 'KReclaimable: 242052 kB' 'Slab: 831340 kB' 'SReclaimable: 242052 kB' 'SUnreclaim: 589288 kB' 'KernelStack: 25024 kB' 'PageTables: 8544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7486732 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226840 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ 
MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.025 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.026 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:47.027 nr_hugepages=1024 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:47.027 resv_hugepages=0 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:47.027 surplus_hugepages=0 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:47.027 anon_hugepages=0 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:05:47.027 
15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110718616 kB' 'MemAvailable: 113848548 kB' 'Buffers: 11424 kB' 'Cached: 9239672 kB' 'SwapCached: 0 kB' 'Active: 6330992 kB' 'Inactive: 3442756 kB' 'Active(anon): 5940500 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 526120 kB' 'Mapped: 178516 kB' 'Shmem: 5417848 kB' 'KReclaimable: 242052 kB' 'Slab: 831340 kB' 'SReclaimable: 242052 kB' 'SUnreclaim: 589288 kB' 'KernelStack: 25056 kB' 'PageTables: 8712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7502256 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226824 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
[[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 
15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.027 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 61094260 kB' 'MemUsed: 4567740 kB' 'SwapCached: 0 kB' 'Active: 1433464 kB' 'Inactive: 121196 kB' 'Active(anon): 1118300 kB' 'Inactive(anon): 0 kB' 'Active(file): 315164 kB' 'Inactive(file): 121196 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1268560 kB' 'Mapped: 117780 kB' 'AnonPages: 289260 kB' 'Shmem: 832200 kB' 'KernelStack: 13384 kB' 'PageTables: 5264 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124728 kB' 'Slab: 378020 kB' 'SReclaimable: 124728 kB' 'SUnreclaim: 253292 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
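This pass is the per-node half of the verification: setup/hugepages.sh first checks that the global HugePages_Total equals nr_hugepages plus surplus and reserved pages (the '(( 1024 == nr_hugepages + surp + resv ))' entry earlier in this trace), then re-reads HugePages_Surp from /sys/devices/system/node/node0/meminfo and folds it, together with the reserved count, into the expected value for node 0. With the numbers visible in this run the arithmetic is trivial; a compact sketch with values hard-coded from the trace rather than re-read from the system:

    # Accounting behind the "node0=1024 expecting 1024" line later in this trace
    # (values from this run; surplus and reserved pages are both 0 here).
    nr_hugepages=1024      # configured by the default_setup test
    surp=0 resv=0          # HugePages_Surp / HugePages_Rsvd
    (( 1024 == nr_hugepages + surp + resv )) && echo "global hugepage count consistent"

    node0_sys=1024                             # HugePages_Total read from node0's meminfo
    node0_expected=$(( 1024 + resv + surp ))   # expected count, adjusted as in hugepages.sh
    echo "node0=${node0_sys} expecting ${node0_expected}"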
00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.028 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': '
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:05:47.029 node0=1024 expecting 1024
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:05:47.029
00:05:47.029 real 0m6.157s
00:05:47.029 user 0m1.654s
00:05:47.029 sys 0m2.669s
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:47.029 15:42:07 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x
00:05:47.029 ************************************
00:05:47.029 END TEST default_setup
00:05:47.029 ************************************
00:05:47.029 15:42:07 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:05:47.029 15:42:07 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:05:47.029 15:42:07 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:47.029 15:42:07 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:47.029 15:42:07 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:47.029 ************************************
00:05:47.029 START TEST per_node_1G_alloc
00:05:47.029 ************************************
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=,
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=()
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:47.029 15:42:07 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:05:51.270 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver
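Before the driver-binding messages that follow, the trace above shows how per_node_1G_alloc sizes its request: get_test_nr_hugepages is called with 1048576 kB (1 GiB) for nodes 0 and 1, and at the system's 2048 kB default hugepage size that works out to 1048576 / 2048 = 512 pages per node, hence NRHUGE=512 and HUGENODE=0,1 for scripts/setup.sh (1024 pages in total). As a generic illustration of how such a per-node request maps onto the kernel's sysfs interface (this is the standard Linux mechanism, not necessarily what scripts/setup.sh does internally):

    # Per-node sizing used by this test, then a generic per-node allocation (needs root).
    size_kb=1048576                                  # 1 GiB requested per node
    hugepagesize_kb=2048                             # default hugepage size on this system
    nr_per_node=$(( size_kb / hugepagesize_kb ))     # = 512

    for node in 0 1; do
        echo "$nr_per_node" | sudo tee \
            "/sys/devices/system/node/node${node}/hugepages/hugepages-${hugepagesize_kb}kB/nr_hugepages"
    done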
00:05:51.270 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:51.270 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 
110719816 kB' 'MemAvailable: 113849720 kB' 'Buffers: 11424 kB' 'Cached: 9239804 kB' 'SwapCached: 0 kB' 'Active: 6325932 kB' 'Inactive: 3442756 kB' 'Active(anon): 5935440 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520488 kB' 'Mapped: 177044 kB' 'Shmem: 5417980 kB' 'KReclaimable: 241996 kB' 'Slab: 830864 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 588868 kB' 'KernelStack: 25168 kB' 'PageTables: 8240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7466480 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226996 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.270 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.270 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.271 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110719020 kB' 'MemAvailable: 113848924 kB' 'Buffers: 11424 kB' 'Cached: 9239804 kB' 'SwapCached: 0 kB' 'Active: 6327008 kB' 'Inactive: 3442756 kB' 'Active(anon): 5936516 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521580 kB' 'Mapped: 177036 kB' 'Shmem: 5417980 kB' 'KReclaimable: 241996 kB' 'Slab: 830824 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 588828 kB' 'KernelStack: 25200 kB' 'PageTables: 8948 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7468480 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 227124 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 
'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
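A quick consistency check on the /proc/meminfo snapshot printed a few entries above: the Hugetlb figure is the configured page count times the page size, and 1024 pages at 2048 kB give exactly the reported 2097152 kB. In bash arithmetic, with the numbers copied from that snapshot:

    # HugePages_Total x Hugepagesize == Hugetlb, per the snapshot in this trace.
    hugepages_total=1024
    hugepagesize_kb=2048
    echo $(( hugepages_total * hugepagesize_kb ))   # 2097152, matching 'Hugetlb: 2097152 kB'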
00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.272 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:51.273 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110722092 kB' 'MemAvailable: 113851996 kB' 'Buffers: 11424 kB' 'Cached: 9239804 kB' 'SwapCached: 0 kB' 'Active: 6327936 kB' 'Inactive: 3442756 kB' 'Active(anon): 5937444 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522508 kB' 'Mapped: 176984 kB' 'Shmem: 5417980 kB' 'KReclaimable: 241996 kB' 'Slab: 830820 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 588824 kB' 'KernelStack: 25360 kB' 'PageTables: 9140 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7470904 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 227076 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 
'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.274 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:51.275 nr_hugepages=1024 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:51.275 resv_hugepages=0 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:51.275 surplus_hugepages=0 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:51.275 anon_hugepages=0 00:05:51.275 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110722668 kB' 'MemAvailable: 113852572 kB' 'Buffers: 11424 kB' 'Cached: 9239804 kB' 'SwapCached: 0 kB' 'Active: 6327476 kB' 'Inactive: 3442756 kB' 'Active(anon): 5936984 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 
'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522472 kB' 'Mapped: 176892 kB' 'Shmem: 5417980 kB' 'KReclaimable: 241996 kB' 'Slab: 830824 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 588828 kB' 'KernelStack: 25312 kB' 'PageTables: 9128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7467928 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 227092 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
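Between the meminfo scans, the trace also shows the bookkeeping that setup/hugepages.sh performs with these values: the global pool is compared against requested plus surplus plus reserved pages, and a per-node split is then verified. The following is a hedged sketch of that accounting, reconstructed from this run (1024 pages requested, 0 surplus, 0 reserved, two NUMA nodes each expected to hold 512 pages); the awk extraction and the exact variable names are illustrative, not the script's own.

#!/usr/bin/env bash
# Sketch of the hugepage accounting visible in this trace (illustrative only):
# the global pool must equal requested pages plus surplus and reserved pages,
# and each NUMA node is expected to carry its share (512 of 1024 here).
shopt -s extglob
set -e

nr_hugepages=1024
surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)      # 0 in this run
resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)      # 0 in this run
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)    # 1024 in this run

(( total == nr_hugepages + surp + resv ))   # global pool fully accounted for

# Per-node check: the test expects 512 pages on each of the two nodes.
for nodedir in /sys/devices/system/node/node+([0-9]); do
    n=${nodedir##*node}
    node_total=$(awk '/HugePages_Total:/ {print $NF}' "$nodedir/meminfo")
    node_surp=$(awk '/HugePages_Surp:/ {print $NF}' "$nodedir/meminfo")
    echo "node$n: total=$node_total surp=$node_surp (expected 512)"
done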
00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.276 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.277 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 62160860 kB' 'MemUsed: 3501140 kB' 'SwapCached: 0 kB' 'Active: 1430776 kB' 'Inactive: 121196 kB' 'Active(anon): 1115612 kB' 'Inactive(anon): 0 kB' 'Active(file): 315164 kB' 'Inactive(file): 121196 
kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1268656 kB' 'Mapped: 117556 kB' 'AnonPages: 286440 kB' 'Shmem: 832296 kB' 'KernelStack: 13608 kB' 'PageTables: 5932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124680 kB' 'Slab: 377800 kB' 'SReclaimable: 124680 kB' 'SUnreclaim: 253120 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.278 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
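The long run of '[[ <field> == HugePages_Surp ]] ... continue' entries above is setup/common.sh's get_meminfo walking node 0's meminfo one field at a time until it reaches the requested key. A condensed, illustrative sketch of that lookup, assuming the standard /proc/meminfo and /sys/devices/system/node/nodeN/meminfo layout (the function name node_meminfo and its locals are mine, not the verbatim helper):

  #!/usr/bin/env bash
  shopt -s extglob   # needed for the +([0-9]) prefix strip below

  node_meminfo() {
      local get=$1 node=${2:-} mem_f=/proc/meminfo
      local -a mem
      local line var val _

      # Use the per-node view when a node index is given and present.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi

      mapfile -t mem < "$mem_f"
      # Per-node files prefix every line with "Node N "; drop that prefix.
      mem=("${mem[@]#Node +([0-9]) }")

      # Scan "key: value kB" lines until the requested key matches; every
      # miss is one of the [[ ... ]] / continue pairs seen in the trace.
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue
          echo "$val"
          return 0
      done
      return 1
  }

  node_meminfo HugePages_Surp 0   # surplus 2 MiB pages on NUMA node 0

In the dump a few entries above, node 0 reports 'HugePages_Surp: 0', so this lookup comes back as 0.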
00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:51.279 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 48554480 kB' 'MemUsed: 12127556 kB' 'SwapCached: 0 kB' 'Active: 4899960 kB' 'Inactive: 3321560 kB' 'Active(anon): 4824632 kB' 'Inactive(anon): 0 kB' 'Active(file): 75328 kB' 'Inactive(file): 3321560 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7982660 kB' 'Mapped: 59848 kB' 'AnonPages: 239008 kB' 'Shmem: 4585772 kB' 'KernelStack: 11592 kB' 'PageTables: 2816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 117316 kB' 'Slab: 453004 kB' 'SReclaimable: 117316 kB' 'SUnreclaim: 335688 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.279 
15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.279 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
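Both per-node dumps above report 'HugePages_Surp: 0', so once this second lookup returns, the verification reduces to comparing the per-node counts the test set up against what each node's meminfo now reports. Roughly, with this run's numbers filled in (the real verify step in hugepages.sh compares the two sets of counts via its sorted_t/sorted_s arrays; a direct per-node equality check is the simple equivalent for this symmetric layout):

  # Values observed in this run: 1024 pages requested, split 512/512.
  nodes_test=(512 512)   # requested split plus any reserved/surplus pages (none added here)
  nodes_sys=(512 512)    # HugePages_Total reported by node0/node1 meminfo

  for node in "${!nodes_test[@]}"; do
      echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
      [[ ${nodes_test[node]} -eq ${nodes_sys[node]} ]] || exit 1
  done

That is where the 'nodeN=512 expecting 512' lines just below come from; the test passes only when every node matches.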
00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:51.280 node0=512 expecting 512 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:51.280 
15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:05:51.280 node1=512 expecting 512 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:05:51.280 00:05:51.280 real 0m4.173s 00:05:51.280 user 0m1.625s 00:05:51.280 sys 0m2.621s 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:51.280 15:42:11 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:51.280 ************************************ 00:05:51.280 END TEST per_node_1G_alloc 00:05:51.280 ************************************ 00:05:51.280 15:42:11 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:51.280 15:42:11 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:05:51.280 15:42:11 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:51.280 15:42:11 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:51.280 15:42:11 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:51.280 ************************************ 00:05:51.280 START TEST even_2G_alloc 00:05:51.280 ************************************ 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:51.280 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:51.281 15:42:11 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:51.281 15:42:11 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:55.491 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:55.491 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:55.491 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110719952 kB' 'MemAvailable: 113849856 kB' 'Buffers: 11424 kB' 'Cached: 9239984 kB' 'SwapCached: 0 kB' 'Active: 6323900 kB' 'Inactive: 3442756 kB' 'Active(anon): 5933408 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 517992 kB' 'Mapped: 177052 kB' 'Shmem: 5418160 kB' 'KReclaimable: 241996 kB' 'Slab: 831376 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 589380 kB' 'KernelStack: 24896 kB' 'PageTables: 7768 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7461268 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226772 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
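This second field-by-field walk belongs to the even_2G_alloc verification: a few entries back the test saw 'always [madvise] never' in the transparent-hugepage setting, i.e. THP is not pinned to [never], so it also reads the system-wide AnonHugePages figure before checking the explicit hugepage counts, presumably so THP-backed memory can be told apart from the reserved pool. In spirit (the sysfs path is the standard location, assumed rather than shown in the trace, and the awk one-liner is illustrative, not the script's own parser):

  thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
  if [[ $thp != *"[never]"* ]]; then
      # The same value the trace reaches after scanning /proc/meminfo
      # entry by entry: current THP-backed anonymous memory, in kB.
      anon_kb=$(awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo)
      echo "AnonHugePages: ${anon_kb} kB"
  fi

In this run the /proc/meminfo dump above reports 'AnonHugePages: 0 kB'.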
00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.491 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110720732 kB' 'MemAvailable: 113850636 kB' 'Buffers: 11424 kB' 'Cached: 9239988 kB' 'SwapCached: 0 kB' 'Active: 6324060 kB' 'Inactive: 3442756 kB' 'Active(anon): 5933568 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518128 kB' 'Mapped: 177012 kB' 'Shmem: 5418164 kB' 'KReclaimable: 241996 kB' 'Slab: 831376 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 589380 kB' 'KernelStack: 24880 kB' 'PageTables: 7712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7461292 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226740 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 
'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.492 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.493 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- 
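Every one of these "[[ Field == ... ]] / continue" runs comes from one small helper. Below is a minimal sketch of it, reconstructed from the commands visible in this xtrace (the local get/node variables, mem_f=/proc/meminfo, mapfile, the IFS=': ' read loop); it is an approximation for readability, not the verbatim SPDK test/setup/common.sh source.

    #!/usr/bin/env bash
    shopt -s extglob  # needed for the +([0-9]) pattern used on per-node files

    # Print the value of one /proc/meminfo (or per-NUMA-node meminfo) field.
    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f mem

        mem_f=/proc/meminfo
        # With a node id, the per-node file is used instead; its lines carry a
        # "Node <n> " prefix, which is stripped below.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")

        # This loop is what produces the long runs above: every field name is
        # compared against the requested key and skipped until it matches.
        while IFS=': ' read -r var val _; do
            [[ $var == $get ]] || continue
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")

        return 1
    }

    get_meminfo HugePages_Surp   # prints 0 on this box, hence the "echo 0" above

Splitting on IFS=': ' is also why only the number survives: for "AnonHugePages: 0 kB" the value field is 0 and the unit lands in the discarded third field.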
(The HugePages_Surp scan completes at 00:05:55.494 when the field finally matches:)
00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0
00:05:55.494 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
(get_meminfo captures another full /proc/meminfo snapshot at 00:05:55.494. It differs from the previous one only by small drifts: MemFree: 110721672 kB, MemAvailable: 113851576 kB, Cached: 9240004 kB, Active: 6323544 kB, Active(anon): 5933052 kB, AnonPages: 518140 kB, Mapped: 176932 kB, Shmem: 5418180 kB, Slab: 831360 kB, SUnreclaim: 589364 kB, KernelStack: 24896 kB, PageTables: 7764 kB, Committed_AS: 7461448 kB. Every hugepage counter is unchanged: HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Rsvd: 0, HugePages_Surp: 0, AnonHugePages: 0 kB. The same field-by-field scan then runs once more, this time looking for HugePages_Rsvd.)
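For comparison only, and not what setup/common.sh does (its helper also understands the per-NUMA-node meminfo files, which is presumably why the test fetches one key per call), the handful of counters this test reads one at a time could be pulled from /proc/meminfo in a single pass:

    # One-pass read of the hugepage-related counters (comparison only).
    awk '$1 ~ /^(AnonHugePages|HugePages_(Total|Free|Rsvd|Surp)):$/ { print $1, $2 }' /proc/meminfo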
(HugePages_Rsvd matches at 00:05:55.496 and the collected results are reported:)
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:05:55.496 nr_hugepages=1024
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:05:55.496 resv_hugepages=0
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:05:55.496 surplus_hugepages=0
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:05:55.496 anon_hugepages=0
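The assertions that follow check that these numbers add up. A sketch of the shape of that check is below; the variable names (anon, surp, resv, nr_hugepages) follow the trace and get_meminfo is the helper sketched earlier, but both the expected count of 1024 and the /proc/sys/vm/nr_hugepages source for nr_hugepages are assumptions made for illustration, since the real script sets them elsewhere.

    # Hugepage accounting check, as asserted at setup/hugepages.sh@107-@109
    # (sketch; expected/nr_hugepages sourcing is assumed, see lead-in above).
    expected=1024                                  # pool size requested by the test
    anon=$(get_meminfo AnonHugePages)              # 0 in this run (stored at hugepages.sh@97)
    surp=$(get_meminfo HugePages_Surp)             # 0
    resv=$(get_meminfo HugePages_Rsvd)             # 0
    nr_hugepages=$(< /proc/sys/vm/nr_hugepages)    # 1024 here (assumed source)

    # The whole pool must be accounted for by the configured pages plus any
    # surplus and reserved pages; with surp == resv == 0 it must also equal
    # nr_hugepages exactly.
    ((expected == nr_hugepages + surp + resv)) || echo "hugepage accounting is off" >&2
    ((expected == nr_hugepages)) || echo "unexpected nr_hugepages" >&2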
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:05:55.496 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
(Both assertions hold with nr_hugepages=1024, surp=0 and resv=0, and get_meminfo captures a third /proc/meminfo snapshot at 00:05:55.496. Again only the usual counters have drifted slightly: MemFree: 110722588 kB, MemAvailable: 113852492 kB, Cached: 9240036 kB, Active: 6323956 kB, Active(anon): 5933464 kB, AnonPages: 518528 kB, Shmem: 5418212 kB, KernelStack: 24944 kB, PageTables: 8024 kB, Committed_AS: 7461836 kB, VmallocUsed: 226756 kB, while the hugepage counters are the same: HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Rsvd: 0, HugePages_Surp: 0. The field-by-field scan for HugePages_Total then continues in the trace that follows.)
-r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
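
The long runs of "[[ <key> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] ... continue" records above and below are the xtrace of a simple key lookup: the helper being traced here (get_meminfo in setup/common.sh, per the script@line tags in the log) dumps a meminfo file and walks it field by field until it reaches the requested counter. The block below is a minimal standalone sketch of that pattern, not the repository code; the name lookup_meminfo and the sed-based prefix strip are illustrative stand-ins, and it assumes a Linux host with per-node meminfo files like the 2-node machine in this run.

#!/usr/bin/env bash
# Minimal sketch of the lookup traced above (illustrative, not the repository
# code): scan a meminfo file and print the value of one counter.
lookup_meminfo() {
    local key=$1 node=${2:-}        # e.g. HugePages_Total; node is optional
    local file=/proc/meminfo
    # With a node argument, read the per-node view instead of the global one.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        file=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node files prefix every line with "Node N ", so strip that first;
    # IFS=': ' then splits "HugePages_Total:    1024" into name and value.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$key" ]]; then
            echo "$val"
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$file")
    return 1
}

lookup_meminfo HugePages_Total      # 1024 in the run traced here
lookup_meminfo HugePages_Surp 0     # node 0 surplus pages, 0 in this run
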
00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.497 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
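
Stripped of the xtrace noise, the arithmetic this even_2G_alloc pass is checking is small: the global pool read back from /proc/meminfo must equal nr_hugepages plus surplus plus reserved pages (1024 + 0 + 0 here), and with even allocation across sockets the kernel is expected to have placed 512 of the 2048 kB pages on each of the two NUMA nodes, which is what the later 'node0=512 expecting 512' and 'node1=512 expecting 512' lines report. The sketch below is a condensed, standalone restatement of that check, not the repository's hugepages.sh; the awk one-liners are illustrative replacements for the get_meminfo calls traced above, and the node count and pool size are taken from this run.

#!/usr/bin/env bash
# Condensed restatement of the even_2G_alloc accounting, assuming the same
# layout as this run: 2 NUMA nodes, 1024 pages of the default 2048 kB size.
nr_hugepages=1024 no_nodes=2
total=$(awk '/^HugePages_Total/ {print $NF}' /proc/meminfo)
surp=$(awk '/^HugePages_Surp/ {print $NF}' /proc/meminfo)
resv=$(awk '/^HugePages_Rsvd/ {print $NF}' /proc/meminfo)
(( total == nr_hugepages + surp + resv )) || echo "global pool mismatch: $total"
for (( node = 0; node < no_nodes; node++ )); do
    f=/sys/devices/system/node/node${node}/meminfo
    node_total=$(awk '/HugePages_Total/ {print $NF}' "$f")
    node_surp=$(awk '/HugePages_Surp/ {print $NF}' "$f")
    expected=$(( nr_hugepages / no_nodes + node_surp + resv ))
    echo "node${node}=${expected} expecting ${node_total}"   # node0=512 expecting 512
done
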
00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:55.498 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 62173808 kB' 'MemUsed: 3488192 kB' 'SwapCached: 0 kB' 'Active: 1429928 kB' 'Inactive: 121196 kB' 'Active(anon): 1114764 kB' 'Inactive(anon): 0 kB' 'Active(file): 315164 kB' 'Inactive(file): 121196 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1268708 kB' 'Mapped: 117576 kB' 'AnonPages: 285596 kB' 'Shmem: 832348 kB' 'KernelStack: 13416 kB' 'PageTables: 5388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124680 kB' 'Slab: 378268 kB' 'SReclaimable: 124680 kB' 'SUnreclaim: 253588 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.498 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 
15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:55.499 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 48548664 kB' 'MemUsed: 12133372 kB' 'SwapCached: 0 kB' 'Active: 4894028 kB' 'Inactive: 3321560 kB' 'Active(anon): 4818700 kB' 'Inactive(anon): 0 kB' 'Active(file): 75328 kB' 'Inactive(file): 3321560 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7982792 kB' 'Mapped: 59356 kB' 'AnonPages: 232904 kB' 'Shmem: 4585904 kB' 'KernelStack: 11512 kB' 'PageTables: 2552 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 117316 kB' 'Slab: 453092 kB' 'SReclaimable: 117316 kB' 'SUnreclaim: 335776 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.499 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 
15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:55.500 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:55.501 node0=512 expecting 512 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:55.501 15:42:15 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:05:55.501 node1=512 expecting 512 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:05:55.501 00:05:55.501 real 0m4.143s 00:05:55.501 user 0m1.666s 00:05:55.501 sys 0m2.547s 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:55.501 15:42:15 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:55.501 ************************************ 00:05:55.501 END TEST even_2G_alloc 00:05:55.501 ************************************ 00:05:55.501 15:42:15 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:55.501 15:42:15 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:05:55.501 15:42:15 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:55.501 15:42:15 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:55.501 15:42:15 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:55.501 ************************************ 00:05:55.501 START TEST odd_alloc 00:05:55.501 ************************************ 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@83 -- # : 0 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:55.501 15:42:15 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:59.711 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:59.711 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@25 -- # [[ -n '' ]] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110741904 kB' 'MemAvailable: 113871808 kB' 'Buffers: 11424 kB' 'Cached: 9240164 kB' 'SwapCached: 0 kB' 'Active: 6325288 kB' 'Inactive: 3442756 kB' 'Active(anon): 5934796 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519304 kB' 'Mapped: 177096 kB' 'Shmem: 5418340 kB' 'KReclaimable: 241996 kB' 'Slab: 832032 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 590036 kB' 'KernelStack: 24960 kB' 'PageTables: 8068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7462592 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226820 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue [setup/common.sh@31-32 repeats identically for every remaining /proc/meminfo field from SwapCached through HardwareCorrupted; none matches AnonHugePages] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
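As a reading aid: the trace above is setup/common.sh's get_meminfo walking /proc/meminfo one field at a time until the requested key (AnonHugePages here) matches, then echoing its value back to the caller; the echo 0 on the next trace line is that value. The helper below is a minimal standalone sketch of the same idea, not the SPDK setup/common.sh implementation; the function name and loop structure are illustrative only.

shopt -s extglob

get_meminfo_sketch() {
	local get=$1 node=${2:-}
	local mem_f=/proc/meminfo
	# If a node number is given and a per-node meminfo exists, read that instead.
	[[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
		mem_f=/sys/devices/system/node/node$node/meminfo

	local -a mem
	mapfile -t mem <"$mem_f"
	# Per-node meminfo prefixes every line with "Node <N> "; strip it so both
	# layouts parse identically (same idea as the mem=(...) step in the trace).
	mem=("${mem[@]#Node +([0-9]) }")

	local line var val _
	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<<"$line"
		if [[ $var == "$get" ]]; then
			echo "$val"   # e.g. 0 for AnonHugePages, 1025 for HugePages_Total
			return 0
		fi
	done
	return 1
}

# Usage (illustrative): get_meminfo_sketch HugePages_Total; get_meminfo_sketch HugePages_Free 0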
00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:59.711 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110741892 kB' 'MemAvailable: 113871796 kB' 'Buffers: 11424 kB' 'Cached: 9240168 kB' 'SwapCached: 0 kB' 'Active: 6324672 kB' 'Inactive: 3442756 kB' 'Active(anon): 5934180 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519164 kB' 'Mapped: 176948 kB' 'Shmem: 5418344 kB' 'KReclaimable: 241996 kB' 'Slab: 832016 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 590020 kB' 'KernelStack: 24944 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7462608 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226788 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.712 15:42:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ [setup/common.sh@31-32 repeats identically for every field from MemAvailable through CmaTotal; none matches HugePages_Surp] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 --
# continue 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:59.712 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110742640 kB' 'MemAvailable: 113872544 kB' 'Buffers: 
11424 kB' 'Cached: 9240184 kB' 'SwapCached: 0 kB' 'Active: 6324688 kB' 'Inactive: 3442756 kB' 'Active(anon): 5934196 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519164 kB' 'Mapped: 176948 kB' 'Shmem: 5418360 kB' 'KReclaimable: 241996 kB' 'Slab: 832016 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 590020 kB' 'KernelStack: 24944 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7462628 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226788 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var 
val _ [setup/common.sh@31-32 repeats identically for every field from Active through ShmemHugePages; none matches HugePages_Rsvd, and the excerpt breaks off during this scan]
-- # IFS=': ' 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.713 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:59.714 15:42:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:05:59.714 nr_hugepages=1025 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:59.714 resv_hugepages=0 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:59.714 surplus_hugepages=0 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:59.714 anon_hugepages=0 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110743952 kB' 'MemAvailable: 113873856 kB' 'Buffers: 11424 kB' 'Cached: 9240204 kB' 'SwapCached: 0 kB' 'Active: 6324712 kB' 'Inactive: 3442756 kB' 'Active(anon): 5934220 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519164 kB' 'Mapped: 176948 kB' 'Shmem: 5418380 kB' 'KReclaimable: 241996 kB' 'Slab: 832016 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 590020 kB' 'KernelStack: 24944 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7462652 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226788 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 
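The trace above and below is setup/common.sh's get_meminfo helper walking a meminfo file one field at a time: each line is split on IFS=': ' into a key and a value, keys that do not match the requested one (here HugePages_Total) are skipped with `continue`, and the matching value is finally echoed back (the `echo 1025` further down). A condensed sketch of that lookup follows; the function name and exact structure are illustrative, not the literal setup/common.sh source, which slurps the file with mapfile and a printf as the trace shows.

    # get_meminfo_sketch KEY [NODE] - illustrative re-implementation of the lookup traced here
    get_meminfo_sketch() {
        local get=$1 node=$2 line var val
        local mem_f=/proc/meminfo
        # with a node index, prefer the per-node stats used later in this test
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while read -r line; do
            line=${line#Node "$node" }              # per-node files prefix each key with "Node N"
            IFS=': ' read -r var val _ <<< "$line"  # e.g. "HugePages_Total:  1025" -> var / val
            if [[ $var == "$get" ]]; then
                echo "$val"                         # mirrors the "echo 1025" / "echo 0" seen in the trace
                return 0
            fi
        done < "$mem_f"
        return 1
    }

With the dump printed above, `get_meminfo_sketch HugePages_Total` would print 1025 and `get_meminfo_sketch HugePages_Surp 0` would print 0.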
00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.714 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node0/meminfo ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 62171548 kB' 'MemUsed: 3490452 kB' 'SwapCached: 0 kB' 'Active: 1429916 kB' 'Inactive: 121196 kB' 'Active(anon): 1114752 kB' 'Inactive(anon): 0 kB' 'Active(file): 315164 kB' 'Inactive(file): 121196 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1268752 kB' 'Mapped: 117604 kB' 'AnonPages: 285532 kB' 'Shmem: 832392 kB' 'KernelStack: 13384 kB' 'PageTables: 5288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124680 kB' 'Slab: 378672 kB' 'SReclaimable: 124680 kB' 'SUnreclaim: 253992 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
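The dump just above comes from /sys/devices/system/node/node0/meminfo and reports HugePages_Total: 512 for node 0; the get_nodes trace slightly earlier (hugepages.sh@29-30) recorded the per-node counts as nodes_sys[0]=512 and nodes_sys[1]=513 with no_nodes=2. A minimal sketch of one way to read those per-node figures straight from the hugepage sysfs counters, assuming the 2048 kB hugepage size reported in the dump (path and variable names illustrative, not necessarily how get_nodes obtains them):

    # One way to collect the per-node 2 MiB hugepage counts recorded as nodes_sys.
    declare -A nodes_sys
    for node in /sys/devices/system/node/node[0-9]*; do
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    declare -p nodes_sys   # for this run: nodes_sys=([0]="512" [1]="513")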
00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:59.715 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 
kB' 'MemFree: 48573524 kB' 'MemUsed: 12108512 kB' 'SwapCached: 0 kB' 'Active: 4894684 kB' 'Inactive: 3321560 kB' 'Active(anon): 4819356 kB' 'Inactive(anon): 0 kB' 'Active(file): 75328 kB' 'Inactive(file): 3321560 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7982916 kB' 'Mapped: 59344 kB' 'AnonPages: 233384 kB' 'Shmem: 4586028 kB' 'KernelStack: 11528 kB' 'PageTables: 2600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 117316 kB' 'Slab: 453344 kB' 'SReclaimable: 117316 kB' 'SUnreclaim: 336028 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:19 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
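With both per-node dumps now printed (node 0 above shows HugePages_Total: 512, node 1 here shows HugePages_Total: 513), the odd_alloc accounting closes: 512 + 513 = 1025, matching the global HugePages_Total read earlier, with resv_hugepages=0 and HugePages_Surp: 0 in both node dumps. A small cross-check along those lines, illustrative only and not part of hugepages.sh:

    # Sum HugePages_Total across the NUMA nodes and compare with the global pool.
    total=0
    for f in /sys/devices/system/node/node[0-9]*/meminfo; do
        n=$(awk '/HugePages_Total/ {print $NF}' "$f")
        echo "$f: $n"
        total=$((total + n))
    done
    echo "per-node sum: $total (global HugePages_Total in this run: 1025)"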
00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:05:59.716 node0=512 expecting 513 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:05:59.716 node1=513 expecting 512 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:59.716 00:05:59.716 real 0m4.120s 00:05:59.716 user 0m1.623s 00:05:59.716 sys 0m2.568s 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:59.716 15:42:20 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:59.716 ************************************ 00:05:59.716 END TEST odd_alloc 00:05:59.716 ************************************ 00:05:59.716 15:42:20 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:59.716 15:42:20 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:05:59.716 15:42:20 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:59.716 15:42:20 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:59.716 15:42:20 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:59.716 ************************************ 00:05:59.716 START TEST custom_alloc 00:05:59.716 ************************************ 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:05:59.716 15:42:20 
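The 'node0=512 expecting 513' / 'node1=513 expecting 512' lines above show the odd_alloc acceptance rule: the odd total may land as 513+512 on either NUMA node, so the check at hugepages.sh@126-130 compares the sorted sets of counts (sorted_t vs sorted_s) rather than requiring a node-for-node match. Below is a minimal bash sketch of that comparison with assumed counts; only the indexed-array sorting trick mirrors the trace, the variable names are illustrative.

#!/usr/bin/env bash
# Hedged sketch (not the project's hugepages.sh): accept an odd hugepage split
# regardless of which NUMA node received the extra page, mirroring the
# "[[ 512 513 == \5\1\2\ \5\1\3 ]]" comparison in the trace above.

got=(512 513)    # counts read back per node (assumed values)
want=(513 512)   # counts the test asked for per node (assumed values)

sorted_got=() ; sorted_want=()
for node in "${!got[@]}"; do
  # Use the count itself as an array index; indexed-array keys expand in
  # ascending order, which sorts the two multisets for us.
  sorted_got[${got[node]}]=1
  sorted_want[${want[node]}]=1
done

# "512 513" == "512 513" even though the per-node placement is swapped.
[[ "${!sorted_got[*]}" == "${!sorted_want[*]}" ]] && echo "odd split accepted"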
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:59.716 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # 
nr_hugepages=1024 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- 
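The get_test_nr_hugepages calls traced above turn a pool size in kB into a page count by dividing by the default hugepage size (2048 kB, per the Hugepagesize field in the meminfo dumps further down): 1048576 kB gives 512 pages and 2097152 kB gives 1024. A small sketch of that arithmetic follows; the function name and guard are illustrative, not the project's.

#!/usr/bin/env bash
# Hedged sketch of the size-to-page-count step visible in the trace above.

default_hugepages_kb=2048   # Hugepagesize reported in the meminfo dumps below

pages_for_size() {
  local size_kb=$1
  (( size_kb >= default_hugepages_kb )) || { echo "size too small" >&2; return 1; }
  echo $(( size_kb / default_hugepages_kb ))
}

pages_for_size 1048576   # -> 512  (the first pool in the trace)
pages_for_size 2097152   # -> 1024 (the second pool)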
setup/hugepages.sh@78 -- # return 0 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:59.717 15:42:20 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:03.922 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:06:03.922 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- 
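With nodes_hp[0]=512 and nodes_hp[1]=1024 fixed above, the loop at hugepages.sh@181-183 builds one HUGENODE entry per node and sums the per-node targets, which is where HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' and the nr_hugepages=1536 total come from. A hedged sketch of that assembly is below; the comma join via IFS is an assumption about how the array is later flattened into the single string seen in the log.

#!/usr/bin/env bash
# Hedged sketch of the HUGENODE assembly and total seen in the trace above.

nodes_hp=([0]=512 [1]=1024)   # per-node targets from the trace

HUGENODE=()
total=0
for node in "${!nodes_hp[@]}"; do
  HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")   # one entry per node
  (( total += nodes_hp[node] ))                     # running page total
done

joined=$(IFS=,; echo "${HUGENODE[*]}")   # comma-join, as in the log line
echo "HUGENODE=$joined"                  # nodes_hp[0]=512,nodes_hp[1]=1024
echo "total hugepages: $total"           # 1536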
setup/common.sh@28 -- # mapfile -t mem 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109719224 kB' 'MemAvailable: 112849128 kB' 'Buffers: 11424 kB' 'Cached: 9240348 kB' 'SwapCached: 0 kB' 'Active: 6331700 kB' 'Inactive: 3442756 kB' 'Active(anon): 5941208 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525396 kB' 'Mapped: 177576 kB' 'Shmem: 5418524 kB' 'KReclaimable: 241996 kB' 'Slab: 832084 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 590088 kB' 'KernelStack: 24976 kB' 'PageTables: 8100 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7469780 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226808 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:06:03.922 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.923 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109721052 kB' 'MemAvailable: 112850956 kB' 'Buffers: 11424 kB' 'Cached: 9240352 kB' 'SwapCached: 0 kB' 'Active: 6326320 kB' 'Inactive: 3442756 kB' 'Active(anon): 5935828 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520164 kB' 'Mapped: 177020 kB' 'Shmem: 5418528 kB' 'KReclaimable: 241996 kB' 'Slab: 832076 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 590080 kB' 'KernelStack: 24976 kB' 'PageTables: 8068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7465180 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226772 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 
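The guard at hugepages.sh@96 above matches the contents of the transparent_hugepage policy file ('always [madvise] never') against *[never]*, so AnonHugePages from /proc/meminfo is only pulled in when THP is not disabled; here it comes back as anon=0. A minimal sketch of that check follows; the awk lookup is an illustrative stand-in for the get_meminfo helper traced here.

#!/usr/bin/env bash
# Hedged sketch of the anon-hugepage guard seen at hugepages.sh@96-97 above.

thp_enabled=$(< /sys/kernel/mm/transparent_hugepage/enabled)

anon=0
if [[ $thp_enabled != *"[never]"* ]]; then
  # Pull the AnonHugePages value (in kB) out of /proc/meminfo.
  anon=$(awk '$1 == "AnonHugePages:" { print $2 }' /proc/meminfo)
fi
echo "AnonHugePages accounted: ${anon} kB"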
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.924 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:06:03.925 15:42:24 
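The anon, surp and rsvd values in this block all come from the same setup/common.sh field scan: mapfile the meminfo file, strip any 'Node N ' prefix, then read key/value pairs with IFS=': ' until the requested key matches and its value is echoed. Below is a hedged reconstruction for readability, not the project's common.sh; the per-node path fallback and the extglob shopt are assumptions based on the +([0-9]) pattern visible in the trace.

#!/usr/bin/env bash
# Hedged reconstruction of the get_meminfo field scan traced at
# setup/common.sh@17-33 above.
shopt -s extglob   # needed for the +([0-9]) pattern used below

get_meminfo() {
  local get=$1 node=${2:-}
  local var val _ mem_f=/proc/meminfo
  [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
    mem_f=/sys/devices/system/node/node$node/meminfo

  local -a mem
  mapfile -t mem < "$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node prefix, if any

  local line
  for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<<"$line"
    if [[ $var == "$get" ]]; then
      echo "$val"                    # e.g. 0 for HugePages_Surp above
      return 0
    fi
  done
  return 1
}

get_meminfo HugePages_Surp     # matches the surp=0 result in the trace
get_meminfo HugePages_Total 0  # per-node form, if node0 exposes a meminfo file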
setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.925 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109720580 kB' 'MemAvailable: 112850484 kB' 'Buffers: 11424 kB' 'Cached: 9240368 kB' 'SwapCached: 0 kB' 'Active: 6325556 kB' 'Inactive: 3442756 kB' 'Active(anon): 5935064 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519848 kB' 'Mapped: 176972 kB' 'Shmem: 5418544 kB' 'KReclaimable: 241996 kB' 'Slab: 832072 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 590076 kB' 'KernelStack: 24928 kB' 'PageTables: 7944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7464832 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226756 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
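
The backslash-riddled comparisons that dominate this trace, e.g. [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]], are not corruption: when the right-hand side of == inside [[ ]] is a quoted variable, bash's xtrace prints the expanded value with every character backslash-escaped to mark it as a literal (non-glob) match. A minimal reproducer, using made-up variable names rather than anything taken from setup/common.sh:

#!/usr/bin/env bash
set -x
want=HugePages_Rsvd   # field being searched for
have=AnonPages        # field just read from meminfo
# xtrace renders the next line roughly as: [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
[[ $have == "$want" ]] || echo "no match, keep scanning"
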
00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.926 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.927 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:06:03.928 nr_hugepages=1536 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:06:03.928 resv_hugepages=0 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:06:03.928 surplus_hugepages=0 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:06:03.928 anon_hugepages=0 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 
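
The long runs of IFS=': ' / read -r var val _ / field comparisons above and below are setup/common.sh's get_meminfo helper walking a meminfo snapshot one field at a time and echoing the value of the requested key (HugePages_Surp and HugePages_Rsvd so far, HugePages_Total next). A small standalone sketch of that lookup, reconstructed from the commands visible in the trace (mapfile, the 'Node +([0-9]) ' prefix strip, the IFS=': ' read loop) rather than copied from the script:

#!/usr/bin/env bash
# Sketch of a get_meminfo-style lookup; illustrative, not the SPDK implementation.
shopt -s extglob
get_meminfo_sketch() {
	local get=$1 node=$2
	local mem_f=/proc/meminfo mem var val _
	# Per-node queries (node=0, node=1 further down) read the node-local meminfo instead.
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	mapfile -t mem < "$mem_f"
	# Node meminfo lines carry a "Node N " prefix; strip it so keys match /proc/meminfo.
	mem=("${mem[@]#Node +([0-9]) }")
	while IFS=': ' read -r var val _; do
		[[ $var == "$get" ]] && { echo "$val"; return 0; }
	done < <(printf '%s\n' "${mem[@]}")
	return 1
}
get_meminfo_sketch HugePages_Total      # system-wide: 1536 in this run
get_meminfo_sketch HugePages_Surp 0     # NUMA node 0: 0 in this run
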
00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 109719948 kB' 'MemAvailable: 112849852 kB' 'Buffers: 11424 kB' 'Cached: 9240388 kB' 'SwapCached: 0 kB' 'Active: 6325332 kB' 'Inactive: 3442756 kB' 'Active(anon): 5934840 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 519620 kB' 'Mapped: 176972 kB' 'Shmem: 5418564 kB' 'KReclaimable: 241996 kB' 'Slab: 832072 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 590076 kB' 'KernelStack: 24976 kB' 'PageTables: 7680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7464852 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226772 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.928 15:42:24 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:03.929 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 62182628 kB' 'MemUsed: 3479372 kB' 'SwapCached: 0 kB' 'Active: 1430056 kB' 'Inactive: 121196 kB' 'Active(anon): 1114892 kB' 'Inactive(anon): 0 kB' 'Active(file): 315164 kB' 'Inactive(file): 121196 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1268860 kB' 'Mapped: 117632 kB' 'AnonPages: 285548 kB' 'Shmem: 832500 kB' 'KernelStack: 13384 kB' 'PageTables: 5292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 
124680 kB' 'Slab: 378648 kB' 'SReclaimable: 124680 kB' 'SUnreclaim: 253968 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 
15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.930 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
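The field-by-field scan running through these entries is setup/common.sh's get_meminfo helper: it reads the per-node meminfo file (or /proc/meminfo), strips the "Node <n> " prefix, and walks every "Key: value" pair until it reaches the requested key (HugePages_Surp here), echoing that key's value (0 surplus pages in this run). A minimal standalone sketch of that lookup pattern, with an illustrative function name, defaults, and fallback rather than the exact SPDK helper:

  #!/usr/bin/env bash
  # Sketch of the get_meminfo lookup pattern traced above (illustrative only).
  shopt -s extglob   # needed for the +([0-9]) prefix-strip pattern below

  get_meminfo_sketch() {
      local get=$1 node=${2:-}          # key to find, optional NUMA node
      local mem_f=/proc/meminfo line var val _
      # Per-node counters live under /sys/devices/system/node/node<N>/meminfo.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local -a mem
      mapfile -t mem < "$mem_f"
      # Per-node files prefix each line with "Node <N> "; drop that prefix.
      mem=("${mem[@]#Node +([0-9]) }")
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue   # keep scanning until the key matches
          echo "${val:-0}"
          return 0
      done
      echo 0   # fallback if the key never appears (illustrative)
  }

  get_meminfo_sketch HugePages_Surp 1   # e.g. surplus huge pages on NUMA node 1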
00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682036 kB' 'MemFree: 47538920 kB' 'MemUsed: 13143116 kB' 'SwapCached: 0 kB' 'Active: 4895204 kB' 'Inactive: 3321560 kB' 'Active(anon): 4819876 kB' 'Inactive(anon): 0 kB' 'Active(file): 75328 kB' 'Inactive(file): 3321560 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 7982976 kB' 'Mapped: 59340 kB' 'AnonPages: 233924 kB' 'Shmem: 4586088 kB' 'KernelStack: 11544 kB' 'PageTables: 2232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 117316 kB' 'Slab: 453424 kB' 'SReclaimable: 117316 kB' 'SUnreclaim: 336108 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 
'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.931 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
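Once the per-node lookups return, hugepages.sh folds the reserved and surplus counts into nodes_test and prints the per-node summary that appears a little further down ("node0=512 expecting 512", "node1=1024 expecting 1024") before comparing the joined list against the expected layout. A rough, self-contained sketch of that kind of per-node check on a two-node machine follows; the array names, the free-plus-surplus arithmetic, and the expected counts are illustrative, not the exact hugepages.sh logic:

  # Compare the huge pages visible on each NUMA node against a hypothetical
  # expected split matching this run (512 on node 0, 1024 on node 1).
  expected=(512 1024)
  actual=()
  for node in "${!expected[@]}"; do
      f=/sys/devices/system/node/node$node/meminfo
      # Per-node lines look like "Node 1 HugePages_Free:  1024"; field 4 is the count.
      free=$(awk '/HugePages_Free/ {print $4}' "$f" 2>/dev/null)
      surp=$(awk '/HugePages_Surp/ {print $4}' "$f" 2>/dev/null)
      actual[node]=$(( ${free:-0} + ${surp:-0} ))
      echo "node$node=${actual[node]} expecting ${expected[node]}"
  done
  # Join the per-node counts and compare, mirroring the final
  # "[[ 512,1024 == ... ]]" check that closes the test below.
  layout=$(IFS=,; echo "${actual[*]}")
  [[ $layout == "512,1024" ]] && echo "per-node layout matches" || echo "unexpected layout: $layout"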
00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.932 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:03.933 15:42:24 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:06:03.933 node0=512 expecting 512
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:06:03.933 node1=1024 expecting 1024
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:06:03.933 
00:06:03.933 real	0m4.149s
00:06:03.933 user	0m1.656s
00:06:03.933 sys	0m2.566s
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:03.933 15:42:24 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:06:03.933 ************************************
00:06:03.933 END TEST custom_alloc
00:06:03.933 ************************************
00:06:03.933 15:42:24 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:06:03.933 15:42:24 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:06:03.933 15:42:24 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:03.933 15:42:24 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:03.933 15:42:24 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:06:03.933 ************************************
00:06:03.933 START TEST no_shrink_alloc
00:06:03.933 ************************************
00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc
00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc
-- setup/hugepages.sh@49 -- # local size=2097152 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:06:03.933 15:42:24 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:08.145 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:06:08.145 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # 
verify_nr_hugepages 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110754392 kB' 'MemAvailable: 113884296 kB' 'Buffers: 11424 kB' 'Cached: 9240512 kB' 'SwapCached: 0 kB' 'Active: 6327116 kB' 'Inactive: 3442756 kB' 'Active(anon): 5936624 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520608 kB' 'Mapped: 177116 kB' 'Shmem: 5418688 kB' 'KReclaimable: 241996 kB' 'Slab: 831940 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 589944 kB' 'KernelStack: 24976 kB' 'PageTables: 7992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7467228 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226964 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.145 15:42:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.145 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
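The scan in progress here is verify_nr_hugepages collecting AnonHugePages: because the transparent_hugepage setting read just above is "always [madvise] never" rather than "[never]", the test records the anonymous THP figure before it checks the explicit HugePages counters per node. A standalone sketch of that check, using the standard kernel paths and illustrative variable names:

  # Record anonymous huge pages only when THP is not fully disabled.
  thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null)  # e.g. "always [madvise] never"
  anon=0
  if [[ $thp != *"[never]"* ]]; then
      # THP active: anonymous huge pages can exist alongside the reserved pool,
      # so capture them separately from HugePages_Total/Free/Surp.
      anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
  fi
  echo "AnonHugePages baseline: ${anon:-0} kB"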
00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.146 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # 
[[ -e /sys/devices/system/node/node/meminfo ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110753768 kB' 'MemAvailable: 113883672 kB' 'Buffers: 11424 kB' 'Cached: 9240516 kB' 'SwapCached: 0 kB' 'Active: 6326772 kB' 'Inactive: 3442756 kB' 'Active(anon): 5936280 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520272 kB' 'Mapped: 177068 kB' 'Shmem: 5418692 kB' 'KReclaimable: 241996 kB' 'Slab: 831924 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 589928 kB' 'KernelStack: 24992 kB' 'PageTables: 8184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7467244 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226948 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.147 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
[[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:08.148 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- 
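The xtrace above shows the get_meminfo helper from setup/common.sh scanning /proc/meminfo: each line is split with IFS=': ' into a key and a value, every key other than the requested HugePages_Surp falls through to continue, and the matching line echoes the bare value (0 in this run) before returning. A minimal sketch of that lookup pattern follows, assuming bash with extglob available; the name get_meminfo_value and the example calls are illustrative, not part of the SPDK scripts.

#!/usr/bin/env bash
# Hypothetical, condensed illustration of the lookup pattern traced above;
# not the setup/common.sh helper itself.
shopt -s extglob

get_meminfo_value() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # With a node index, prefer that node's meminfo when it exists; with no
    # index the node/ path never exists, so /proc/meminfo is used (as above).
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local line var val _
    while IFS= read -r line; do
        line=${line#Node +([0-9]) }        # per-node files prefix each line with "Node N "
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then      # e.g. HugePages_Surp, HugePages_Rsvd, HugePages_Total
            echo "$val"
            return 0
        fi
    done < "$mem_f"
    return 1
}

# Example: this run reports 0, 0 and 1024 for these three keys.
get_meminfo_value HugePages_Surp
get_meminfo_value HugePages_Rsvd
get_meminfo_value HugePages_Total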
setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110752392 kB' 'MemAvailable: 113882296 kB' 'Buffers: 11424 kB' 'Cached: 9240536 kB' 'SwapCached: 0 kB' 'Active: 6326556 kB' 'Inactive: 3442756 kB' 'Active(anon): 5936064 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520516 kB' 'Mapped: 176992 kB' 'Shmem: 5418712 kB' 'KReclaimable: 241996 kB' 'Slab: 831904 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 589908 kB' 'KernelStack: 25072 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7467268 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226996 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.149 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:06:08.150 nr_hugepages=1024 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:06:08.150 resv_hugepages=0 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:06:08.150 surplus_hugepages=0 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:06:08.150 anon_hugepages=0 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:06:08.150 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- 
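Once surp and resv are known, hugepages.sh cross-checks the pool: the values echoed in this run are nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0 and anon_hugepages=0, and the arithmetic tests (( 1024 == nr_hugepages + surp + resv )) and (( 1024 == nr_hugepages )) both hold. A hypothetical recap of that arithmetic with this run's numbers is given here; the variable named requested is an illustrative name, not the script's.

# Hypothetical recap of the consistency check traced above, using the values
# echoed in this run.
requested=1024      # huge pages the test configured
nr_hugepages=1024   # matches HugePages_Total: 1024 in the meminfo dump above
surp=0              # HugePages_Surp
resv=0              # HugePages_Rsvd

# 1024 == 1024 + 0 + 0, so both checks pass in this run.
(( requested == nr_hugepages + surp + resv )) && echo "hugepage pool matches the request"
(( requested == nr_hugepages ))               && echo "no surplus or reserved pages"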
setup/common.sh@28 -- # mapfile -t mem 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110752260 kB' 'MemAvailable: 113882164 kB' 'Buffers: 11424 kB' 'Cached: 9240556 kB' 'SwapCached: 0 kB' 'Active: 6326264 kB' 'Inactive: 3442756 kB' 'Active(anon): 5935772 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520212 kB' 'Mapped: 176992 kB' 'Shmem: 5418732 kB' 'KReclaimable: 241996 kB' 'Slab: 831904 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 589908 kB' 'KernelStack: 24992 kB' 'PageTables: 8112 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7467288 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226996 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 
15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.151 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:08.152 15:42:28 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
[xtrace elided: the setup/common.sh@31-32 read loop steps over the remaining meminfo fields (NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted), hitting "continue" on each one until the key matches HugePages_Total]
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
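[editor's note] The loop traced above is setup/common.sh's get_meminfo helper scanning a meminfo file one "key: value" line at a time. A minimal bash sketch of that kind of lookup follows; it is a reconstruction for illustration only (the name get_meminfo_sketch is made up, and the per-node path is the standard sysfs location), not the SPDK helper itself.

    # Print the value of one meminfo field, optionally from a per-NUMA-node
    # copy of the file (hedged reconstruction of the traced behaviour).
    get_meminfo_sketch() {
        local key=$1 node=${2:-}        # e.g. HugePages_Total, optional node index
        local file=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            file=/sys/devices/system/node/node$node/meminfo
        fi
        local line var val _
        while IFS= read -r line; do
            line=${line#"Node $node "}   # per-node lines carry a "Node <n> " prefix
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$key" ]]; then
                echo "$val"
                return 0
            fi
        done < "$file"
        return 1
    }

    get_meminfo_sketch HugePages_Total     # prints 1024 on this box, per the trace
    get_meminfo_sketch HugePages_Surp 0    # surplus pages on node 0 (0 in this run)

The real helper reads the whole file with mapfile and strips the Node prefix with an extglob pattern, as the trace shows; the line-by-line form above is simply easier to read in isolation.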
[xtrace elided: get_meminfo sets its locals (get=HugePages_Surp, node=0), switches mem_f from /proc/meminfo to /sys/devices/system/node/node0/meminfo, strips the "Node 0" prefix from each line, and prints the snapshot it read:]
00:06:08.152 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 61151632 kB' 'MemUsed: 4510368 kB' 'SwapCached: 0 kB' 'Active: 1429500 kB' 'Inactive: 121196 kB' 'Active(anon): 1114336 kB' 'Inactive(anon): 0 kB' 'Active(file): 315164 kB' 'Inactive(file): 121196 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1268996 kB' 'Mapped: 117652 kB' 'AnonPages: 284804 kB' 'Shmem: 832636 kB' 'KernelStack: 13368 kB' 'PageTables: 5096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124680 kB' 'Slab: 378804 kB' 'SReclaimable: 124680 kB' 'SUnreclaim: 254124 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace elided: the @31-32 read loop skips every field from MemTotal through HugePages_Free until the key matches HugePages_Surp]
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
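[editor's note] That closes the per-node accounting: node0 reports 1024 hugepages, which is exactly what the test expects once reserved and surplus pages are added in. A hedged bash sketch of the same bookkeeping follows (verify_hugepages_sketch is an illustrative name; setup/hugepages.sh keeps more state than this).

    # Check the global and per-node hugepage accounting seen in the trace.
    verify_hugepages_sketch() {
        local nr_requested=$1
        local total surp resv node per_node sum=0
        total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
        surp=$(awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo)
        resv=$(awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo)
        # Mirrors setup/hugepages.sh@110 above: (( 1024 == nr_hugepages + surp + resv ))
        (( total == nr_requested + surp + resv )) || return 1
        # Mirrors the "node0=1024 expecting 1024" echo: per-node meminfo lines
        # carry a "Node <n>" prefix, so the key sits in field 3.
        for node in /sys/devices/system/node/node[0-9]*; do
            per_node=$(awk '$3 == "HugePages_Total:" {print $4}' "$node/meminfo")
            echo "node${node##*node}=$per_node"
            (( sum += per_node ))
        done
        (( sum == total ))      # per-node totals have to add up to the global one
    }

    verify_hugepages_sketch 1024 && echo "hugepage accounting consistent"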
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:06:08.154 15:42:28 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:06:12.365 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver
00:06:12.365 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver
00:06:12.365 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:06:12.365 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages
[xtrace elided: verify_nr_hugepages declares its locals (node, sorted_t, sorted_s, surp, resv, anon)]
00:06:12.365 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:06:12.365 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
[xtrace elided: get_meminfo sets get=AnonHugePages with no node argument, so mem_f stays /proc/meminfo; it mapfile-reads the file and prints the snapshot below before scanning it]
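[editor's note] The INFO line is the whole point of the no_shrink_alloc case: NRHUGE=512 was requested, but node0 already holds 1024 pages, so the existing allocation is kept rather than shrunk. A hedged sketch of that decision follows (request_node_hugepages is an illustrative name; scripts/setup.sh handles multiple page sizes, nodes and policies).

    # Never shrink an existing per-node hugepage pool, only grow it.
    request_node_hugepages() {
        local node=$1 want=$2
        # hugepages-2048kB matches the Hugepagesize reported in the snapshots below.
        local nr=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages
        local have
        have=$(<"$nr")
        if (( have >= want )); then
            echo "INFO: Requested $want hugepages but $have already allocated on node$node"
            return 0
        fi
        echo "$want" > "$nr"    # growing the pool needs root
    }

    request_node_hugepages 0 512    # on this box: keeps the 1024 pages and prints the INFO line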
00:06:12.365 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110773800 kB' 'MemAvailable: 113903704 kB' 'Buffers: 11424 kB' 'Cached: 9240664 kB' 'SwapCached: 0 kB' 'Active: 6327372 kB' 'Inactive: 3442756 kB' 'Active(anon): 5936880 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520748 kB' 'Mapped: 177136 kB' 'Shmem: 5418840 kB' 'KReclaimable: 241996 kB' 'Slab: 831628 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 589632 kB' 'KernelStack: 24976 kB' 'PageTables: 7660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7467860 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226852 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB'
[xtrace elided: the @31-32 read loop skips every field from MemTotal through HardwareCorrupted until the key matches AnonHugePages]
00:06:12.367 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:06:12.367 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:06:12.367 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:06:12.367 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
00:06:12.367 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
[xtrace elided: get_meminfo sets get=HugePages_Surp with no node argument, keeps mem_f=/proc/meminfo, and prints a second, nearly identical snapshot (MemFree: 110773864 kB, AnonPages: 521264 kB, PageTables: 8136 kB; the hugepage counters are unchanged: HugePages_Total 1024, HugePages_Free 1024, HugePages_Rsvd 0, HugePages_Surp 0), then starts scanning it for HugePages_Surp]
00:06:12.367 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
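[editor's note] Two quick cross-checks on the snapshot values above (plain arithmetic, nothing SPDK-specific): the hugetlb pool size is HugePages_Total times Hugepagesize, and the node0 MemUsed figure from the earlier per-node snapshot is simply MemTotal minus MemFree.

    # 1024 pages * 2048 kB/page = 2097152 kB, matching the 'Hugetlb:' line.
    awk '$1 == "HugePages_Total:" {t = $2}
         $1 == "Hugepagesize:"    {sz = $2}
         $1 == "Hugetlb:"         {h = $2}
         END { if (t * sz == h) print "hugetlb pool consistent"; else print "hugetlb mismatch" }' /proc/meminfo

    # Node 0 snapshot: MemUsed = MemTotal - MemFree.
    echo $(( 65662000 - 61151632 ))     # 4510368 kB, the value reported above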
[xtrace elided: the @31-32 read loop keeps skipping fields (MemFree through HugePages_Rsvd) as it works toward HugePages_Surp]
00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110774448 kB' 'MemAvailable: 113904352 kB' 'Buffers: 11424 kB' 'Cached: 9240684 kB' 'SwapCached: 0 kB' 'Active: 6326644 kB' 'Inactive: 3442756 kB' 'Active(anon): 5936152 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520516 kB' 'Mapped: 177004 kB' 'Shmem: 5418860 kB' 'KReclaimable: 241996 kB' 'Slab: 831840 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 589844 kB' 'KernelStack: 24928 kB' 'PageTables: 7804 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7465040 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226900 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB' 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.369 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
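The trace above shows how get_meminfo in setup/common.sh resolves a single key: the whole meminfo file is slurped with mapfile, any "Node <id> " prefix is stripped with an extglob expansion, and each "Key: value" pair is split with IFS=': ' until the requested key is reached, whose value is then echoed. Below is a minimal standalone sketch of that lookup, for orientation only; get_meminfo_value is a hypothetical name, and the real helper in setup/common.sh may differ in its details.

shopt -s extglob

# Sketch of the lookup traced above, assuming a /proc/meminfo-style
# "Key: value [kB]" layout. Hypothetical helper, not part of setup/common.sh.
get_meminfo_value() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-node queries read the node-specific meminfo file when it exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node 0 " prefix used by per-node files
    local line var val rest
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val rest <<< "$line"
        if [[ $var == "$get" ]]; then   # e.g. HugePages_Surp
            echo "$val"                 # value without the kB suffix
            return 0
        fi
    done
    return 1
}

# Example (hypothetical): surp=$(get_meminfo_value HugePages_Surp)   # -> 0 in this run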
[... setup/common.sh@31-32: the scan then walks every key of the snapshot above, MemTotal through HugePages_Free, without matching HugePages_Rsvd ...]
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:06:12.371 nr_hugepages=1024
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:06:12.371 resv_hugepages=0
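At this point the test has surp=0, resv=0 and nr_hugepages=1024 in hand; the entries below run the no_shrink_alloc accounting checks (setup/hugepages.sh@104-@110), re-reading HugePages_Total and requiring it to line up with the requested count plus surplus and reserved pages, before the same lookup is repeated per NUMA node. A rough sketch of that consistency check follows, reusing the hypothetical get_meminfo_value helper from the sketch above; the exact expressions in hugepages.sh may be arranged differently.

# Hypothetical illustration of the accounting relation visible in the trace below.
check_hugepage_accounting() {
    local expected=$1   # the page count the test configured, 1024 in this run
    local nr_hugepages surp resv
    nr_hugepages=$(get_meminfo_value HugePages_Total)
    surp=$(get_meminfo_value HugePages_Surp)
    resv=$(get_meminfo_value HugePages_Rsvd)
    echo "nr_hugepages=$nr_hugepages"
    echo "resv_hugepages=$resv"
    echo "surplus_hugepages=$surp"
    # The pool is consistent when the visible total covers surplus and reserved
    # pages and still equals the requested size.
    (( expected == nr_hugepages + surp + resv )) && (( expected == nr_hugepages ))
}

# Example (hypothetical): check_hugepage_accounting 1024 || echo "hugepage accounting mismatch"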
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:06:12.371 surplus_hugepages=0
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:06:12.371 anon_hugepages=0
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:06:12.371 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344036 kB' 'MemFree: 110775404 kB' 'MemAvailable: 113905308 kB' 'Buffers: 11424 kB' 'Cached: 9240724 kB' 'SwapCached: 0 kB' 'Active: 6327048 kB' 'Inactive: 3442756 kB' 'Active(anon): 5936556 kB' 'Inactive(anon): 0 kB' 'Active(file): 390492 kB' 'Inactive(file): 3442756 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 520988 kB' 'Mapped: 177020 kB' 'Shmem: 5418900 kB' 'KReclaimable: 241996 kB' 'Slab: 831840 kB' 'SReclaimable: 241996 kB' 'SUnreclaim: 589844 kB' 'KernelStack: 24848 kB' 'PageTables: 7612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7465564 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 226804 kB' 'VmallocChunk: 0 kB' 'Percpu: 86528 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 592164 kB' 'DirectMap2M: 15865856 kB' 'DirectMap1G: 119537664 kB'
[... setup/common.sh@31-32: per-key scan of this snapshot, MemTotal through Unaccepted; no key matches HugePages_Total until the HugePages_Total entry itself ...]
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 
0 )) 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 61167716 kB' 'MemUsed: 4494284 kB' 'SwapCached: 0 kB' 'Active: 1431144 kB' 'Inactive: 121196 kB' 'Active(anon): 1115980 kB' 'Inactive(anon): 0 kB' 'Active(file): 315164 kB' 'Inactive(file): 121196 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1269160 kB' 'Mapped: 117664 kB' 'AnonPages: 286308 kB' 'Shmem: 832800 kB' 'KernelStack: 13384 kB' 'PageTables: 5292 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 124680 kB' 'Slab: 378724 kB' 'SReclaimable: 124680 kB' 'SUnreclaim: 254044 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.373 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:06:12.374 node0=1024 expecting 1024 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:06:12.374 00:06:12.374 real 0m8.219s 00:06:12.374 user 0m3.160s 00:06:12.374 sys 0m5.203s 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:12.374 15:42:32 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:06:12.374 ************************************ 00:06:12.374 END TEST no_shrink_alloc 00:06:12.374 ************************************ 00:06:12.374 15:42:32 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:06:12.374 15:42:32 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:06:12.374 00:06:12.374 real 0m31.624s 00:06:12.374 user 0m11.626s 00:06:12.374 sys 0m18.631s 00:06:12.374 15:42:32 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:12.374 15:42:32 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:06:12.374 ************************************ 00:06:12.374 END TEST hugepages 00:06:12.374 ************************************ 00:06:12.374 15:42:32 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:06:12.374 15:42:32 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 
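[annotation] The long trace above (through the "END TEST hugepages" banner) is common.sh's get_meminfo walking every /proc/meminfo or per-node meminfo line with `read -r var val _` and echoing the value once the requested key matches. A minimal stand-alone sketch of that lookup, assuming illustrative names (get_meminfo_sketch is not the script's own function):
get_meminfo_sketch() {
    local key=$1 node=${2:-}            # e.g. HugePages_Total, optional NUMA node
    local file=/proc/meminfo
    # Per-node stats live under sysfs and prefix every line with "Node <n> ".
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        file=/sys/devices/system/node/node$node/meminfo
    fi
    # Strip the node prefix, then match "<key>:" and print its value -- one awk
    # pass instead of the traced per-line read/continue loop.
    sed 's/^Node [0-9]* //' "$file" | awk -v k="$key" '$1 == (k ":") {print $2; exit}'
}
# e.g.  get_meminfo_sketch HugePages_Total        -> 1024 in the run above
#       get_meminfo_sketch HugePages_Surp 0       -> 0 for node0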
00:06:12.374 15:42:32 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:12.374 15:42:32 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:12.374 15:42:32 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:12.374 ************************************ 00:06:12.374 START TEST driver 00:06:12.374 ************************************ 00:06:12.374 15:42:32 setup.sh.driver -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:06:12.374 * Looking for test storage... 00:06:12.374 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:12.374 15:42:32 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:06:12.374 15:42:32 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:12.374 15:42:32 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:17.734 15:42:38 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:06:17.734 15:42:38 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:17.734 15:42:38 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:17.734 15:42:38 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:06:17.734 ************************************ 00:06:17.734 START TEST guess_driver 00:06:17.734 ************************************ 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 370 > 0 )) 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:06:17.734 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:06:17.734 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:06:17.734 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:06:17.734 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:06:17.734 insmod 
/lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:06:17.734 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:06:17.734 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:06:17.734 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:06:17.734 Looking for driver=vfio-pci 00:06:17.735 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:17.735 15:42:38 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:06:17.735 15:42:38 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:06:17.735 15:42:38 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:21.940 15:42:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:41 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:41 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:41 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 
setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:21.940 15:42:42 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:23.852 15:42:43 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:06:23.852 15:42:43 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:06:23.852 15:42:43 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:06:23.852 15:42:43 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:06:23.852 15:42:43 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:06:23.852 15:42:43 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:23.852 15:42:43 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 
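[annotation] The guess_driver trace above picks vfio-pci because IOMMU groups exist (the run counted 370) and `modprobe --show-depends vfio_pci` resolves to real .ko modules; the unsafe no-IOMMU parameter was read as N. A hedged sketch of that decision, with illustrative function naming rather than driver.sh's exact code:
pick_vfio_sketch() {
    local unsafe=N ngroups
    [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] &&
        unsafe=$(cat /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
    # Count IOMMU groups without relying on shell glob options.
    ngroups=$(compgen -G '/sys/kernel/iommu_groups/*' | wc -l)
    if (( ngroups > 0 )) || [[ $unsafe == [Yy]* ]]; then
        # is_driver-style check: the dependency chain must contain actual .ko objects
        if modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
            echo vfio-pci
            return 0
        fi
    fi
    echo 'No valid driver found'
    return 1
}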
00:06:29.135 00:06:29.135 real 0m11.170s 00:06:29.135 user 0m3.103s 00:06:29.135 sys 0m5.545s 00:06:29.136 15:42:49 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:29.136 15:42:49 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:06:29.136 ************************************ 00:06:29.136 END TEST guess_driver 00:06:29.136 ************************************ 00:06:29.136 15:42:49 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:06:29.136 00:06:29.136 real 0m16.613s 00:06:29.136 user 0m4.741s 00:06:29.136 sys 0m8.534s 00:06:29.136 15:42:49 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:29.136 15:42:49 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:06:29.136 ************************************ 00:06:29.136 END TEST driver 00:06:29.136 ************************************ 00:06:29.136 15:42:49 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:06:29.136 15:42:49 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:06:29.136 15:42:49 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:29.136 15:42:49 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:29.136 15:42:49 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:29.136 ************************************ 00:06:29.136 START TEST devices 00:06:29.136 ************************************ 00:06:29.136 15:42:49 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:06:29.136 * Looking for test storage... 00:06:29.136 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:06:29.136 15:42:49 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:06:29.136 15:42:49 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:06:29.136 15:42:49 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:06:29.136 15:42:49 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:06:33.346 15:42:53 
setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:65:00.0 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:06:33.346 15:42:53 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:06:33.346 15:42:53 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:06:33.346 No valid GPT data, bailing 00:06:33.346 15:42:53 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:06:33.346 15:42:53 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:06:33.346 15:42:53 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:06:33.346 15:42:53 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:06:33.346 15:42:53 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:06:33.346 15:42:53 setup.sh.devices -- setup/common.sh@80 -- # echo 2000398934016 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@204 -- # (( 2000398934016 >= min_disk_size )) 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:65:00.0 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:06:33.346 15:42:53 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:33.346 15:42:53 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:33.608 ************************************ 00:06:33.608 START TEST nvme_mount 00:06:33.608 ************************************ 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 
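[annotation] The devices prologue above selects the test disk by skipping zoned namespaces, treating "No valid GPT data, bailing" (i.e. no existing partition table) as the pass condition, and requiring at least min_disk_size = 3221225472 bytes (3 GiB); nvme0n1 at 2000398934016 bytes on 0000:65:00.0 qualifies. A rough equivalent using only blkid and sysfs -- SPDK's spdk-gpt.py probe is not reproduced here, and the helper name is illustrative:
pick_test_disks_sketch() {
    local min_disk_size=$((3 * 1024 * 1024 * 1024))    # 3221225472, as in the trace
    local sysdev dev size
    for sysdev in /sys/block/nvme*n*; do
        [[ -e $sysdev ]] || continue
        dev=/dev/${sysdev##*/}
        # Skip zoned namespaces (queue/zoned != "none").
        [[ $(cat "$sysdev/queue/zoned" 2>/dev/null || echo none) == none ]] || continue
        # Skip disks that already carry a partition table.
        [[ -n $(blkid -s PTTYPE -o value "$dev" 2>/dev/null) ]] && continue
        size=$(( $(cat "$sysdev/size") * 512 ))        # sysfs size is in 512-byte sectors
        (( size >= min_disk_size )) && echo "${dev##*/}"
    done
}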
00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:33.608 15:42:53 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:06:34.550 Creating new GPT entries in memory. 00:06:34.550 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:34.550 other utilities. 00:06:34.550 15:42:54 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:34.550 15:42:54 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:34.550 15:42:54 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:34.550 15:42:54 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:34.550 15:42:54 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:35.493 Creating new GPT entries in memory. 00:06:35.493 The operation has completed successfully. 
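[annotation] The partition_drive step just traced zaps the namespace and creates a single 1 GiB data partition; the 2048..2099199 sector range is exactly 1073741824 / 512 sectors starting at the conventional 2048 offset. A minimal, destructive sketch of the same sgdisk sequence (run only against a scratch device; the function name and partprobe fallback are assumptions -- the harness itself waits on udev via sync_dev_uevents.sh):
partition_one_sketch() {
    local disk=$1                              # e.g. /dev/nvme0n1, a scratch namespace
    local size_sectors=$(( 1073741824 / 512 )) # 1 GiB in 512-byte sectors
    sgdisk "$disk" --zap-all                   # drop any existing GPT/MBR structures
    local start=2048
    local end=$(( start + size_sectors - 1 ))  # 2099199, matching the trace
    # flock serializes against concurrent partitioners, as the traced command does
    flock "$disk" sgdisk "$disk" --new=1:"$start":"$end"
    partprobe "$disk" 2>/dev/null || true      # re-read the partition table
}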
00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 2431159 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:65:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:35.493 15:42:55 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 
15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.697 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:39.698 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:39.698 15:42:59 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:39.698 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:39.698 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:06:39.698 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:39.698 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:65:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 
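[annotation] cleanup_nvme in the trace above unmounts the test mount point, wipes the partition signature, then wipes the whole namespace (wipefs reports the erased ext4 magic "53 ef" and the GPT/PMBR magic bytes) before the test re-formats the bare disk. A hedged sketch of that teardown order, with illustrative parameter names:
cleanup_nvme_sketch() {
    local mount_point=$1 part=$2 disk=$3    # e.g. .../nvme_mount /dev/nvme0n1p1 /dev/nvme0n1
    mountpoint -q "$mount_point" && umount "$mount_point"
    [[ -b $part ]] && wipefs --all "$part"  # clears the ext4 superblock signature
    [[ -b $disk ]] && wipefs --all "$disk"  # clears primary/backup GPT and the protective MBR
}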
00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:39.698 15:43:00 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- 
setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.901 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.902 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:43.902 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:43.902 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:43.902 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:06:43.902 15:43:03 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:65:00.0 data@nvme0n1 '' '' 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
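[annotation] The verify loops above read the `setup.sh config` output as "<pci> ... <status>" rows (`read -r pci _ _ status`) and only set found=1 when the allowed controller 0000:65:00.0 reports the expected binding in its "Active devices:" note. A rough parser for that kind of listing; the column layout is taken from the traced read, everything else (names, stdin feeding) is an assumption:
verify_active_sketch() {
    local want_pci=$1 want_dev=$2           # e.g. 0000:65:00.0  data@nvme0n1
    local pci _ status found=0
    while read -r pci _ _ status; do
        [[ $pci == "$want_pci" ]] || continue
        # Accept only if the status column mentions the expected active device.
        [[ $status == *"Active devices: "*"$want_dev"* ]] && found=1
    done
    (( found == 1 ))
}
# e.g.  <status listing> | verify_active_sketch 0000:65:00.0 data@nvme0n1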
00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:43.902 15:43:04 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == 
\0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:47.202 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:47.463 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:47.463 00:06:47.463 real 0m13.959s 00:06:47.463 user 0m4.218s 00:06:47.463 sys 0m7.599s 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:47.463 15:43:07 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:06:47.463 ************************************ 00:06:47.463 END TEST nvme_mount 00:06:47.463 ************************************ 00:06:47.463 15:43:07 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:06:47.463 15:43:07 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:06:47.463 15:43:07 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:47.463 15:43:07 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:47.463 15:43:07 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:47.463 ************************************ 00:06:47.463 START TEST dm_mount 00:06:47.463 ************************************ 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:06:47.463 15:43:07 
setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:06:47.463 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:06:47.464 15:43:07 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:06:48.880 Creating new GPT entries in memory. 00:06:48.881 GPT data structures destroyed! You may now partition the disk using fdisk or 00:06:48.881 other utilities. 00:06:48.881 15:43:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:06:48.881 15:43:08 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:48.881 15:43:08 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:48.881 15:43:08 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:48.881 15:43:08 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:06:49.452 Creating new GPT entries in memory. 00:06:49.452 The operation has completed successfully. 00:06:49.452 15:43:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:49.452 15:43:09 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:49.452 15:43:09 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:06:49.452 15:43:09 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:06:49.452 15:43:09 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:06:50.832 The operation has completed successfully. 
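Condensed, the partitioning performed just above is: wipe the GPT, then carve two 1 GiB partitions out of /dev/nvme0n1 (1073741824 bytes at 512-byte sectors = 2097152 sectors each, which is where the 2048:2099199 and 2099200:4196351 ranges come from). The sync_dev_uevents.sh call the test runs alongside sgdisk is omitted here for brevity:

  # Wipe the partition table and create the two 1 GiB test partitions.
  disk=/dev/nvme0n1
  sgdisk "$disk" --zap-all
  flock "$disk" sgdisk "$disk" --new=1:2048:2099199      # -> nvme0n1p1
  flock "$disk" sgdisk "$disk" --new=2:2099200:4196351   # -> nvme0n1p2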
00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 2436291 00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:50.832 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:06:50.833 15:43:10 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:65:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 
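The device-mapper setup above reduces to: create a dm target named nvme_dm_test from the two partitions, resolve it to its dm node (/dev/dm-0 in this run), put ext4 on it, and mount it under test/setup/dm_mount. The table handed to `dmsetup create` is not captured in the xtrace; the linear concatenation below is an assumption, chosen because both nvme0n1p1 and nvme0n1p2 later show up as holders of dm-0:

  # Assumed table: two 1 GiB (2097152-sector) linear segments, back to back.
  dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
  printf '%s\n' '0 2097152 linear /dev/nvme0n1p1 0' \
                '2097152 2097152 linear /dev/nvme0n1p2 0' \
      | dmsetup create nvme_dm_test
  dm=$(readlink -f /dev/mapper/nvme_dm_test)    # -> /dev/dm-0 here
  mkfs.ext4 -qF /dev/mapper/nvme_dm_test
  mkdir -p "$dm_mount" && mount /dev/mapper/nvme_dm_test "$dm_mount"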
00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:06:50.833 15:43:11 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == 
\0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.030 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:65:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == 
output ]] 00:06:55.031 15:43:14 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 
15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:58.324 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:58.584 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:58.584 00:06:58.584 real 0m11.062s 00:06:58.584 user 0m2.970s 00:06:58.584 sys 0m5.152s 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.584 15:43:18 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:58.584 ************************************ 00:06:58.584 END TEST dm_mount 00:06:58.584 ************************************ 00:06:58.584 15:43:18 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:06:58.584 15:43:18 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:58.584 15:43:18 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:58.584 15:43:18 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:58.584 15:43:18 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:58.584 15:43:18 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:58.584 15:43:18 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:58.584 15:43:18 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:58.845 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:58.845 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:06:58.845 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:58.845 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:58.845 15:43:19 
setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:06:58.845 15:43:19 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:58.845 15:43:19 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:58.845 15:43:19 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:58.845 15:43:19 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:58.845 15:43:19 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:58.845 15:43:19 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:58.845 00:06:58.845 real 0m29.869s 00:06:58.845 user 0m8.845s 00:06:58.845 sys 0m15.828s 00:06:58.845 15:43:19 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.845 15:43:19 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:58.845 ************************************ 00:06:58.845 END TEST devices 00:06:58.845 ************************************ 00:06:58.845 15:43:19 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:06:58.845 00:06:58.845 real 1m46.503s 00:06:58.845 user 0m34.613s 00:06:58.845 sys 0m59.765s 00:06:58.845 15:43:19 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.845 15:43:19 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:58.845 ************************************ 00:06:58.845 END TEST setup.sh 00:06:58.845 ************************************ 00:06:59.106 15:43:19 -- common/autotest_common.sh@1142 -- # return 0 00:06:59.106 15:43:19 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:07:03.309 Hugepages 00:07:03.309 node hugesize free / total 00:07:03.309 node0 1048576kB 0 / 0 00:07:03.309 node0 2048kB 1024 / 1024 00:07:03.309 node1 1048576kB 0 / 0 00:07:03.309 node1 2048kB 1024 / 1024 00:07:03.309 00:07:03.309 Type BDF Vendor Device NUMA Driver Device Block devices 00:07:03.309 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - - 00:07:03.309 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - - 00:07:03.309 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - - 00:07:03.309 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - - 00:07:03.309 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - - 00:07:03.309 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - - 00:07:03.309 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - - 00:07:03.309 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - - 00:07:03.309 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:07:03.309 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - - 00:07:03.309 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - - 00:07:03.309 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - - 00:07:03.309 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - - 00:07:03.309 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - - 00:07:03.309 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - - 00:07:03.309 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - - 00:07:03.309 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - - 00:07:03.309 15:43:23 -- spdk/autotest.sh@130 -- # uname -s 00:07:03.309 15:43:23 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:07:03.309 15:43:23 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:07:03.309 15:43:23 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:07.513 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 
00:07:07.513 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:07:07.513 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:07:08.896 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:07:08.896 15:43:29 -- common/autotest_common.sh@1532 -- # sleep 1 00:07:10.280 15:43:30 -- common/autotest_common.sh@1533 -- # bdfs=() 00:07:10.280 15:43:30 -- common/autotest_common.sh@1533 -- # local bdfs 00:07:10.280 15:43:30 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:07:10.280 15:43:30 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:07:10.280 15:43:30 -- common/autotest_common.sh@1513 -- # bdfs=() 00:07:10.280 15:43:30 -- common/autotest_common.sh@1513 -- # local bdfs 00:07:10.280 15:43:30 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:10.280 15:43:30 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:07:10.280 15:43:30 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:07:10.280 15:43:30 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:07:10.280 15:43:30 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:65:00.0 00:07:10.280 15:43:30 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:07:14.485 Waiting for block devices as requested 00:07:14.485 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:07:14.485 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:07:14.485 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:07:14.485 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:07:14.485 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:07:14.485 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:07:14.485 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:07:14.485 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:07:14.485 0000:65:00.0 (8086 0a54): vfio-pci -> nvme 00:07:14.746 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:07:14.746 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:07:14.746 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:07:15.006 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:07:15.006 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:07:15.006 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:07:15.271 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma 00:07:15.271 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:07:15.271 15:43:35 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:07:15.271 15:43:35 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:65:00.0 00:07:15.271 15:43:35 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:07:15.271 15:43:35 -- common/autotest_common.sh@1502 -- # grep 0000:65:00.0/nvme/nvme 00:07:15.271 15:43:35 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:07:15.271 15:43:35 -- common/autotest_common.sh@1503 -- # [[ -z 
/sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 ]] 00:07:15.271 15:43:35 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:07:15.271 15:43:35 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:07:15.271 15:43:35 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:07:15.271 15:43:35 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:07:15.271 15:43:35 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:07:15.271 15:43:35 -- common/autotest_common.sh@1545 -- # grep oacs 00:07:15.271 15:43:35 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:07:15.271 15:43:35 -- common/autotest_common.sh@1545 -- # oacs=' 0xe' 00:07:15.271 15:43:35 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:07:15.271 15:43:35 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:07:15.271 15:43:35 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:07:15.271 15:43:35 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:07:15.271 15:43:35 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:07:15.272 15:43:35 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:07:15.272 15:43:35 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:07:15.272 15:43:35 -- common/autotest_common.sh@1557 -- # continue 00:07:15.272 15:43:35 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:07:15.272 15:43:35 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:15.272 15:43:35 -- common/autotest_common.sh@10 -- # set +x 00:07:15.535 15:43:35 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:07:15.535 15:43:35 -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:15.535 15:43:35 -- common/autotest_common.sh@10 -- # set +x 00:07:15.535 15:43:35 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:07:19.810 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:07:19.810 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:07:21.193 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:07:21.454 15:43:41 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:07:21.454 15:43:41 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:21.454 15:43:41 -- common/autotest_common.sh@10 -- # set +x 00:07:21.454 15:43:41 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:07:21.454 15:43:41 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:07:21.454 15:43:41 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:07:21.454 15:43:41 -- common/autotest_common.sh@1577 -- # bdfs=() 00:07:21.454 15:43:41 -- common/autotest_common.sh@1577 -- # local bdfs 00:07:21.454 15:43:41 -- common/autotest_common.sh@1579 -- # 
get_nvme_bdfs 00:07:21.454 15:43:41 -- common/autotest_common.sh@1513 -- # bdfs=() 00:07:21.454 15:43:41 -- common/autotest_common.sh@1513 -- # local bdfs 00:07:21.454 15:43:41 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:21.454 15:43:41 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:07:21.454 15:43:41 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:07:21.454 15:43:41 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:07:21.454 15:43:41 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:65:00.0 00:07:21.454 15:43:41 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:07:21.454 15:43:41 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:65:00.0/device 00:07:21.454 15:43:41 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:07:21.454 15:43:41 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:07:21.454 15:43:41 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:07:21.454 15:43:41 -- common/autotest_common.sh@1586 -- # printf '%s\n' 0000:65:00.0 00:07:21.454 15:43:41 -- common/autotest_common.sh@1592 -- # [[ -z 0000:65:00.0 ]] 00:07:21.454 15:43:41 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=2447508 00:07:21.454 15:43:41 -- common/autotest_common.sh@1598 -- # waitforlisten 2447508 00:07:21.454 15:43:41 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:21.454 15:43:41 -- common/autotest_common.sh@829 -- # '[' -z 2447508 ']' 00:07:21.454 15:43:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:21.454 15:43:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:21.454 15:43:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:21.454 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:21.454 15:43:41 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:21.454 15:43:41 -- common/autotest_common.sh@10 -- # set +x 00:07:21.715 [2024-07-12 15:43:41.902762] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
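A few entries back, opal_revert_cleanup builds its bdf list by filtering the controllers reported by gen_nvme.sh down to those whose PCI device ID is 0x0a54, the ID the status table above lists for 0000:65:00.0. Condensed, with paths relative to the SPDK checkout for brevity:

  # Keep only NVMe bdfs whose PCI device ID is 0x0a54.
  bdfs=()
  for bdf in $(./scripts/gen_nvme.sh | jq -r '.config[].params.traddr'); do
      [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && bdfs+=("$bdf")
  done
  printf '%s\n' "${bdfs[@]}"     # -> 0000:65:00.0 in this run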
00:07:21.715 [2024-07-12 15:43:41.902822] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2447508 ] 00:07:21.715 [2024-07-12 15:43:41.987552] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.715 [2024-07-12 15:43:42.080877] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.658 15:43:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:22.658 15:43:42 -- common/autotest_common.sh@862 -- # return 0 00:07:22.658 15:43:42 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:07:22.658 15:43:42 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:07:22.658 15:43:42 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:65:00.0 00:07:25.956 nvme0n1 00:07:25.956 15:43:45 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:07:25.956 [2024-07-12 15:43:45.985523] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:07:25.956 request: 00:07:25.956 { 00:07:25.956 "nvme_ctrlr_name": "nvme0", 00:07:25.956 "password": "test", 00:07:25.956 "method": "bdev_nvme_opal_revert", 00:07:25.956 "req_id": 1 00:07:25.956 } 00:07:25.956 Got JSON-RPC error response 00:07:25.956 response: 00:07:25.956 { 00:07:25.956 "code": -32602, 00:07:25.956 "message": "Invalid parameters" 00:07:25.956 } 00:07:25.956 15:43:46 -- common/autotest_common.sh@1604 -- # true 00:07:25.956 15:43:46 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:07:25.956 15:43:46 -- common/autotest_common.sh@1608 -- # killprocess 2447508 00:07:25.956 15:43:46 -- common/autotest_common.sh@948 -- # '[' -z 2447508 ']' 00:07:25.956 15:43:46 -- common/autotest_common.sh@952 -- # kill -0 2447508 00:07:25.956 15:43:46 -- common/autotest_common.sh@953 -- # uname 00:07:25.956 15:43:46 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:25.956 15:43:46 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2447508 00:07:25.956 15:43:46 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:25.956 15:43:46 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:25.956 15:43:46 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2447508' 00:07:25.956 killing process with pid 2447508 00:07:25.956 15:43:46 -- common/autotest_common.sh@967 -- # kill 2447508 00:07:25.956 15:43:46 -- common/autotest_common.sh@972 -- # wait 2447508 00:07:28.495 15:43:48 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:07:28.495 15:43:48 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:07:28.495 15:43:48 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:07:28.495 15:43:48 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:07:28.495 15:43:48 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:07:28.756 Restarting all devices. 
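The bdev_nvme_opal_revert failure above is the expected outcome on this controller (it does not support OPAL), and the log prints the exact JSON-RPC request and error. For reference, the same exchange can be driven without rpc.py by writing the request to the target's Unix socket shown earlier (/var/tmp/spdk.sock); using `nc -U` for that is just one convenient client and assumes a netcat build with Unix-socket support:

  # Raw JSON-RPC form of the rpc.py call above; on this drive it returns the
  # same -32602 "Invalid parameters" error because OPAL is not supported.
  printf '%s\n' '{"jsonrpc":"2.0","id":1,"method":"bdev_nvme_opal_revert","params":{"nvme_ctrlr_name":"nvme0","password":"test"}}' \
      | nc -U /var/tmp/spdk.sock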
00:07:33.060 lstat() error: No such file or directory 00:07:33.060 QAT Error: No GENERAL section found 00:07:33.060 Failed to configure qat_dev0 00:07:33.060 lstat() error: No such file or directory 00:07:33.060 QAT Error: No GENERAL section found 00:07:33.060 Failed to configure qat_dev1 00:07:33.060 lstat() error: No such file or directory 00:07:33.060 QAT Error: No GENERAL section found 00:07:33.060 Failed to configure qat_dev2 00:07:33.060 enable sriov 00:07:33.060 Checking status of all devices. 00:07:33.060 There is 3 QAT acceleration device(s) in the system: 00:07:33.060 qat_dev0 - type: c6xx, inst_id: 0, node_id: 1, bsf: 0000:cc:00.0, #accel: 5 #engines: 10 state: down 00:07:33.060 qat_dev1 - type: c6xx, inst_id: 1, node_id: 1, bsf: 0000:ce:00.0, #accel: 5 #engines: 10 state: down 00:07:33.060 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:d0:00.0, #accel: 5 #engines: 10 state: down 00:07:33.060 0000:cc:00.0 set to 16 VFs 00:07:33.320 0000:ce:00.0 set to 16 VFs 00:07:33.890 0000:d0:00.0 set to 16 VFs 00:07:33.890 Properly configured the qat device with driver uio_pci_generic. 00:07:33.890 15:43:54 -- spdk/autotest.sh@162 -- # timing_enter lib 00:07:33.890 15:43:54 -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:33.890 15:43:54 -- common/autotest_common.sh@10 -- # set +x 00:07:33.890 15:43:54 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:07:33.890 15:43:54 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:07:33.890 15:43:54 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:33.890 15:43:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:33.890 15:43:54 -- common/autotest_common.sh@10 -- # set +x 00:07:34.150 ************************************ 00:07:34.150 START TEST env 00:07:34.150 ************************************ 00:07:34.150 15:43:54 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:07:34.150 * Looking for test storage... 
00:07:34.150 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:07:34.150 15:43:54 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:07:34.150 15:43:54 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:34.150 15:43:54 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.150 15:43:54 env -- common/autotest_common.sh@10 -- # set +x 00:07:34.150 ************************************ 00:07:34.150 START TEST env_memory 00:07:34.150 ************************************ 00:07:34.150 15:43:54 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:07:34.150 00:07:34.150 00:07:34.150 CUnit - A unit testing framework for C - Version 2.1-3 00:07:34.150 http://cunit.sourceforge.net/ 00:07:34.150 00:07:34.150 00:07:34.150 Suite: memory 00:07:34.150 Test: alloc and free memory map ...[2024-07-12 15:43:54.550699] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:07:34.150 passed 00:07:34.150 Test: mem map translation ...[2024-07-12 15:43:54.574417] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:07:34.150 [2024-07-12 15:43:54.574445] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:07:34.151 [2024-07-12 15:43:54.574491] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:07:34.151 [2024-07-12 15:43:54.574499] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:07:34.413 passed 00:07:34.413 Test: mem map registration ...[2024-07-12 15:43:54.625508] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:07:34.413 [2024-07-12 15:43:54.625529] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:07:34.413 passed 00:07:34.413 Test: mem map adjacent registrations ...passed 00:07:34.413 00:07:34.413 Run Summary: Type Total Ran Passed Failed Inactive 00:07:34.413 suites 1 1 n/a 0 0 00:07:34.414 tests 4 4 4 0 0 00:07:34.414 asserts 152 152 152 0 n/a 00:07:34.414 00:07:34.414 Elapsed time = 0.180 seconds 00:07:34.414 00:07:34.414 real 0m0.194s 00:07:34.414 user 0m0.181s 00:07:34.414 sys 0m0.012s 00:07:34.414 15:43:54 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:34.414 15:43:54 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:07:34.414 ************************************ 00:07:34.414 END TEST env_memory 00:07:34.414 ************************************ 00:07:34.414 15:43:54 env -- common/autotest_common.sh@1142 -- # return 0 00:07:34.414 15:43:54 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:34.414 15:43:54 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:34.414 15:43:54 env -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:07:34.414 15:43:54 env -- common/autotest_common.sh@10 -- # set +x 00:07:34.414 ************************************ 00:07:34.414 START TEST env_vtophys 00:07:34.414 ************************************ 00:07:34.414 15:43:54 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:07:34.414 EAL: lib.eal log level changed from notice to debug 00:07:34.414 EAL: Detected lcore 0 as core 0 on socket 0 00:07:34.414 EAL: Detected lcore 1 as core 1 on socket 0 00:07:34.414 EAL: Detected lcore 2 as core 2 on socket 0 00:07:34.414 EAL: Detected lcore 3 as core 3 on socket 0 00:07:34.414 EAL: Detected lcore 4 as core 4 on socket 0 00:07:34.414 EAL: Detected lcore 5 as core 5 on socket 0 00:07:34.414 EAL: Detected lcore 6 as core 6 on socket 0 00:07:34.414 EAL: Detected lcore 7 as core 7 on socket 0 00:07:34.414 EAL: Detected lcore 8 as core 8 on socket 0 00:07:34.414 EAL: Detected lcore 9 as core 9 on socket 0 00:07:34.414 EAL: Detected lcore 10 as core 10 on socket 0 00:07:34.414 EAL: Detected lcore 11 as core 11 on socket 0 00:07:34.414 EAL: Detected lcore 12 as core 12 on socket 0 00:07:34.414 EAL: Detected lcore 13 as core 13 on socket 0 00:07:34.414 EAL: Detected lcore 14 as core 14 on socket 0 00:07:34.414 EAL: Detected lcore 15 as core 15 on socket 0 00:07:34.414 EAL: Detected lcore 16 as core 16 on socket 0 00:07:34.414 EAL: Detected lcore 17 as core 17 on socket 0 00:07:34.414 EAL: Detected lcore 18 as core 18 on socket 0 00:07:34.414 EAL: Detected lcore 19 as core 19 on socket 0 00:07:34.414 EAL: Detected lcore 20 as core 20 on socket 0 00:07:34.414 EAL: Detected lcore 21 as core 21 on socket 0 00:07:34.414 EAL: Detected lcore 22 as core 22 on socket 0 00:07:34.414 EAL: Detected lcore 23 as core 23 on socket 0 00:07:34.414 EAL: Detected lcore 24 as core 24 on socket 0 00:07:34.414 EAL: Detected lcore 25 as core 25 on socket 0 00:07:34.414 EAL: Detected lcore 26 as core 26 on socket 0 00:07:34.414 EAL: Detected lcore 27 as core 27 on socket 0 00:07:34.414 EAL: Detected lcore 28 as core 28 on socket 0 00:07:34.414 EAL: Detected lcore 29 as core 29 on socket 0 00:07:34.414 EAL: Detected lcore 30 as core 30 on socket 0 00:07:34.414 EAL: Detected lcore 31 as core 31 on socket 0 00:07:34.414 EAL: Detected lcore 32 as core 0 on socket 1 00:07:34.414 EAL: Detected lcore 33 as core 1 on socket 1 00:07:34.414 EAL: Detected lcore 34 as core 2 on socket 1 00:07:34.414 EAL: Detected lcore 35 as core 3 on socket 1 00:07:34.414 EAL: Detected lcore 36 as core 4 on socket 1 00:07:34.414 EAL: Detected lcore 37 as core 5 on socket 1 00:07:34.414 EAL: Detected lcore 38 as core 6 on socket 1 00:07:34.414 EAL: Detected lcore 39 as core 7 on socket 1 00:07:34.414 EAL: Detected lcore 40 as core 8 on socket 1 00:07:34.414 EAL: Detected lcore 41 as core 9 on socket 1 00:07:34.414 EAL: Detected lcore 42 as core 10 on socket 1 00:07:34.414 EAL: Detected lcore 43 as core 11 on socket 1 00:07:34.414 EAL: Detected lcore 44 as core 12 on socket 1 00:07:34.414 EAL: Detected lcore 45 as core 13 on socket 1 00:07:34.414 EAL: Detected lcore 46 as core 14 on socket 1 00:07:34.414 EAL: Detected lcore 47 as core 15 on socket 1 00:07:34.414 EAL: Detected lcore 48 as core 16 on socket 1 00:07:34.414 EAL: Detected lcore 49 as core 17 on socket 1 00:07:34.414 EAL: Detected lcore 50 as core 18 on socket 1 00:07:34.414 EAL: Detected lcore 51 as core 19 on socket 1 00:07:34.414 EAL: Detected lcore 52 as core 
20 on socket 1 00:07:34.414 EAL: Detected lcore 53 as core 21 on socket 1 00:07:34.414 EAL: Detected lcore 54 as core 22 on socket 1 00:07:34.414 EAL: Detected lcore 55 as core 23 on socket 1 00:07:34.414 EAL: Detected lcore 56 as core 24 on socket 1 00:07:34.414 EAL: Detected lcore 57 as core 25 on socket 1 00:07:34.414 EAL: Detected lcore 58 as core 26 on socket 1 00:07:34.414 EAL: Detected lcore 59 as core 27 on socket 1 00:07:34.414 EAL: Detected lcore 60 as core 28 on socket 1 00:07:34.414 EAL: Detected lcore 61 as core 29 on socket 1 00:07:34.414 EAL: Detected lcore 62 as core 30 on socket 1 00:07:34.414 EAL: Detected lcore 63 as core 31 on socket 1 00:07:34.414 EAL: Detected lcore 64 as core 0 on socket 0 00:07:34.414 EAL: Detected lcore 65 as core 1 on socket 0 00:07:34.414 EAL: Detected lcore 66 as core 2 on socket 0 00:07:34.414 EAL: Detected lcore 67 as core 3 on socket 0 00:07:34.414 EAL: Detected lcore 68 as core 4 on socket 0 00:07:34.414 EAL: Detected lcore 69 as core 5 on socket 0 00:07:34.414 EAL: Detected lcore 70 as core 6 on socket 0 00:07:34.414 EAL: Detected lcore 71 as core 7 on socket 0 00:07:34.414 EAL: Detected lcore 72 as core 8 on socket 0 00:07:34.414 EAL: Detected lcore 73 as core 9 on socket 0 00:07:34.414 EAL: Detected lcore 74 as core 10 on socket 0 00:07:34.414 EAL: Detected lcore 75 as core 11 on socket 0 00:07:34.414 EAL: Detected lcore 76 as core 12 on socket 0 00:07:34.414 EAL: Detected lcore 77 as core 13 on socket 0 00:07:34.414 EAL: Detected lcore 78 as core 14 on socket 0 00:07:34.414 EAL: Detected lcore 79 as core 15 on socket 0 00:07:34.414 EAL: Detected lcore 80 as core 16 on socket 0 00:07:34.414 EAL: Detected lcore 81 as core 17 on socket 0 00:07:34.414 EAL: Detected lcore 82 as core 18 on socket 0 00:07:34.414 EAL: Detected lcore 83 as core 19 on socket 0 00:07:34.414 EAL: Detected lcore 84 as core 20 on socket 0 00:07:34.414 EAL: Detected lcore 85 as core 21 on socket 0 00:07:34.414 EAL: Detected lcore 86 as core 22 on socket 0 00:07:34.414 EAL: Detected lcore 87 as core 23 on socket 0 00:07:34.414 EAL: Detected lcore 88 as core 24 on socket 0 00:07:34.414 EAL: Detected lcore 89 as core 25 on socket 0 00:07:34.414 EAL: Detected lcore 90 as core 26 on socket 0 00:07:34.414 EAL: Detected lcore 91 as core 27 on socket 0 00:07:34.414 EAL: Detected lcore 92 as core 28 on socket 0 00:07:34.414 EAL: Detected lcore 93 as core 29 on socket 0 00:07:34.414 EAL: Detected lcore 94 as core 30 on socket 0 00:07:34.414 EAL: Detected lcore 95 as core 31 on socket 0 00:07:34.414 EAL: Detected lcore 96 as core 0 on socket 1 00:07:34.414 EAL: Detected lcore 97 as core 1 on socket 1 00:07:34.414 EAL: Detected lcore 98 as core 2 on socket 1 00:07:34.414 EAL: Detected lcore 99 as core 3 on socket 1 00:07:34.414 EAL: Detected lcore 100 as core 4 on socket 1 00:07:34.414 EAL: Detected lcore 101 as core 5 on socket 1 00:07:34.414 EAL: Detected lcore 102 as core 6 on socket 1 00:07:34.414 EAL: Detected lcore 103 as core 7 on socket 1 00:07:34.414 EAL: Detected lcore 104 as core 8 on socket 1 00:07:34.414 EAL: Detected lcore 105 as core 9 on socket 1 00:07:34.414 EAL: Detected lcore 106 as core 10 on socket 1 00:07:34.414 EAL: Detected lcore 107 as core 11 on socket 1 00:07:34.414 EAL: Detected lcore 108 as core 12 on socket 1 00:07:34.414 EAL: Detected lcore 109 as core 13 on socket 1 00:07:34.414 EAL: Detected lcore 110 as core 14 on socket 1 00:07:34.414 EAL: Detected lcore 111 as core 15 on socket 1 00:07:34.414 EAL: Detected lcore 112 as core 16 on socket 1 
00:07:34.414 EAL: Detected lcore 113 as core 17 on socket 1 00:07:34.414 EAL: Detected lcore 114 as core 18 on socket 1 00:07:34.414 EAL: Detected lcore 115 as core 19 on socket 1 00:07:34.414 EAL: Detected lcore 116 as core 20 on socket 1 00:07:34.414 EAL: Detected lcore 117 as core 21 on socket 1 00:07:34.414 EAL: Detected lcore 118 as core 22 on socket 1 00:07:34.414 EAL: Detected lcore 119 as core 23 on socket 1 00:07:34.414 EAL: Detected lcore 120 as core 24 on socket 1 00:07:34.414 EAL: Detected lcore 121 as core 25 on socket 1 00:07:34.414 EAL: Detected lcore 122 as core 26 on socket 1 00:07:34.414 EAL: Detected lcore 123 as core 27 on socket 1 00:07:34.414 EAL: Detected lcore 124 as core 28 on socket 1 00:07:34.414 EAL: Detected lcore 125 as core 29 on socket 1 00:07:34.414 EAL: Detected lcore 126 as core 30 on socket 1 00:07:34.414 EAL: Detected lcore 127 as core 31 on socket 1 00:07:34.414 EAL: Maximum logical cores by configuration: 128 00:07:34.414 EAL: Detected CPU lcores: 128 00:07:34.414 EAL: Detected NUMA nodes: 2 00:07:34.414 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:07:34.414 EAL: Detected shared linkage of DPDK 00:07:34.414 EAL: No shared files mode enabled, IPC will be disabled 00:07:34.414 EAL: No shared files mode enabled, IPC is disabled 00:07:34.414 EAL: PCI driver qat for device 0000:cc:01.0 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:01.1 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:01.2 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:01.3 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:01.4 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:01.5 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:01.6 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:01.7 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:02.0 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:02.1 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:02.2 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:02.3 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:02.4 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:02.5 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:02.6 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:cc:02.7 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:ce:01.0 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:ce:01.1 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:ce:01.2 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:ce:01.3 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:ce:01.4 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:ce:01.5 wants IOVA as 'PA' 00:07:34.414 EAL: PCI driver qat for device 0000:ce:01.6 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:ce:01.7 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:ce:02.0 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:ce:02.1 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:ce:02.2 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:ce:02.3 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:ce:02.4 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:ce:02.5 
wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:ce:02.6 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:ce:02.7 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:01.0 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:01.1 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:01.2 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:01.3 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:01.4 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:01.5 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:01.6 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:01.7 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:02.0 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:02.1 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:02.2 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:02.3 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:02.4 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:02.5 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:02.6 wants IOVA as 'PA' 00:07:34.415 EAL: PCI driver qat for device 0000:d0:02.7 wants IOVA as 'PA' 00:07:34.415 EAL: Bus pci wants IOVA as 'PA' 00:07:34.415 EAL: Bus auxiliary wants IOVA as 'DC' 00:07:34.415 EAL: Bus vdev wants IOVA as 'DC' 00:07:34.415 EAL: Selected IOVA mode 'PA' 00:07:34.415 EAL: Probing VFIO support... 00:07:34.415 EAL: IOMMU type 1 (Type 1) is supported 00:07:34.415 EAL: IOMMU type 7 (sPAPR) is not supported 00:07:34.415 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:07:34.415 EAL: VFIO support initialized 00:07:34.415 EAL: Ask a virtual area of 0x2e000 bytes 00:07:34.415 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:07:34.415 EAL: Setting up physically contiguous memory... 
00:07:34.415 EAL: Setting maximum number of open files to 524288 00:07:34.415 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:07:34.415 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:07:34.415 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:07:34.415 EAL: Ask a virtual area of 0x61000 bytes 00:07:34.415 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:07:34.415 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:34.415 EAL: Ask a virtual area of 0x400000000 bytes 00:07:34.415 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:07:34.415 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:07:34.415 EAL: Ask a virtual area of 0x61000 bytes 00:07:34.415 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:07:34.415 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:34.415 EAL: Ask a virtual area of 0x400000000 bytes 00:07:34.415 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:07:34.415 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:07:34.415 EAL: Ask a virtual area of 0x61000 bytes 00:07:34.415 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:07:34.415 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:34.415 EAL: Ask a virtual area of 0x400000000 bytes 00:07:34.415 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:07:34.415 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:07:34.415 EAL: Ask a virtual area of 0x61000 bytes 00:07:34.415 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:07:34.415 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:07:34.415 EAL: Ask a virtual area of 0x400000000 bytes 00:07:34.415 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:07:34.415 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:07:34.415 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:07:34.415 EAL: Ask a virtual area of 0x61000 bytes 00:07:34.415 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:07:34.415 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:34.415 EAL: Ask a virtual area of 0x400000000 bytes 00:07:34.415 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:07:34.415 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:07:34.415 EAL: Ask a virtual area of 0x61000 bytes 00:07:34.415 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:07:34.415 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:34.415 EAL: Ask a virtual area of 0x400000000 bytes 00:07:34.415 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:07:34.415 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:07:34.415 EAL: Ask a virtual area of 0x61000 bytes 00:07:34.415 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:07:34.415 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:34.415 EAL: Ask a virtual area of 0x400000000 bytes 00:07:34.415 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:07:34.415 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:07:34.415 EAL: Ask a virtual area of 0x61000 bytes 00:07:34.415 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:07:34.415 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:07:34.415 EAL: Ask a virtual area of 0x400000000 bytes 00:07:34.415 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:07:34.415 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:07:34.415 EAL: Hugepages will be freed exactly as allocated. 00:07:34.415 EAL: No shared files mode enabled, IPC is disabled 00:07:34.415 EAL: No shared files mode enabled, IPC is disabled 00:07:34.415 EAL: TSC frequency is ~2600000 KHz 00:07:34.415 EAL: Main lcore 0 is ready (tid=7fa26eac9b00;cpuset=[0]) 00:07:34.415 EAL: Trying to obtain current memory policy. 00:07:34.415 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.415 EAL: Restoring previous memory policy: 0 00:07:34.415 EAL: request: mp_malloc_sync 00:07:34.415 EAL: No shared files mode enabled, IPC is disabled 00:07:34.415 EAL: Heap on socket 0 was expanded by 2MB 00:07:34.415 EAL: PCI device 0000:cc:01.0 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x202001000000 00:07:34.415 EAL: PCI memory mapped at 0x202001001000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:07:34.415 EAL: Trying to obtain current memory policy. 00:07:34.415 EAL: Setting policy MPOL_PREFERRED for socket 1 00:07:34.415 EAL: Restoring previous memory policy: 4 00:07:34.415 EAL: request: mp_malloc_sync 00:07:34.415 EAL: No shared files mode enabled, IPC is disabled 00:07:34.415 EAL: Heap on socket 1 was expanded by 2MB 00:07:34.415 EAL: PCI device 0000:cc:01.1 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x202001002000 00:07:34.415 EAL: PCI memory mapped at 0x202001003000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:01.2 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x202001004000 00:07:34.415 EAL: PCI memory mapped at 0x202001005000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:01.3 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x202001006000 00:07:34.415 EAL: PCI memory mapped at 0x202001007000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:01.4 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x202001008000 00:07:34.415 EAL: PCI memory mapped at 0x202001009000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:01.5 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x20200100a000 00:07:34.415 EAL: PCI memory mapped at 0x20200100b000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:01.6 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x20200100c000 00:07:34.415 EAL: PCI memory mapped at 0x20200100d000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:01.7 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x20200100e000 00:07:34.415 EAL: PCI memory mapped at 0x20200100f000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:02.0 on NUMA socket 1 00:07:34.415 EAL: 
probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x202001010000 00:07:34.415 EAL: PCI memory mapped at 0x202001011000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:02.1 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x202001012000 00:07:34.415 EAL: PCI memory mapped at 0x202001013000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:02.2 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x202001014000 00:07:34.415 EAL: PCI memory mapped at 0x202001015000 00:07:34.415 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:07:34.415 EAL: PCI device 0000:cc:02.3 on NUMA socket 1 00:07:34.415 EAL: probe driver: 8086:37c9 qat 00:07:34.415 EAL: PCI memory mapped at 0x202001016000 00:07:34.416 EAL: PCI memory mapped at 0x202001017000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:07:34.416 EAL: PCI device 0000:cc:02.4 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001018000 00:07:34.416 EAL: PCI memory mapped at 0x202001019000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:07:34.416 EAL: PCI device 0000:cc:02.5 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200101a000 00:07:34.416 EAL: PCI memory mapped at 0x20200101b000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:07:34.416 EAL: PCI device 0000:cc:02.6 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200101c000 00:07:34.416 EAL: PCI memory mapped at 0x20200101d000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:07:34.416 EAL: PCI device 0000:cc:02.7 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200101e000 00:07:34.416 EAL: PCI memory mapped at 0x20200101f000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:01.0 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001020000 00:07:34.416 EAL: PCI memory mapped at 0x202001021000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:01.1 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001022000 00:07:34.416 EAL: PCI memory mapped at 0x202001023000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:01.2 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001024000 00:07:34.416 EAL: PCI memory mapped at 0x202001025000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:01.3 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001026000 00:07:34.416 EAL: PCI memory mapped at 0x202001027000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:01.4 on NUMA socket 1 
00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001028000 00:07:34.416 EAL: PCI memory mapped at 0x202001029000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:01.5 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200102a000 00:07:34.416 EAL: PCI memory mapped at 0x20200102b000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:01.6 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200102c000 00:07:34.416 EAL: PCI memory mapped at 0x20200102d000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:01.7 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200102e000 00:07:34.416 EAL: PCI memory mapped at 0x20200102f000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:02.0 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001030000 00:07:34.416 EAL: PCI memory mapped at 0x202001031000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:02.1 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001032000 00:07:34.416 EAL: PCI memory mapped at 0x202001033000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:02.2 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001034000 00:07:34.416 EAL: PCI memory mapped at 0x202001035000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:02.3 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001036000 00:07:34.416 EAL: PCI memory mapped at 0x202001037000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:02.4 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001038000 00:07:34.416 EAL: PCI memory mapped at 0x202001039000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:02.5 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200103a000 00:07:34.416 EAL: PCI memory mapped at 0x20200103b000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:02.6 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200103c000 00:07:34.416 EAL: PCI memory mapped at 0x20200103d000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:07:34.416 EAL: PCI device 0000:ce:02.7 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200103e000 00:07:34.416 EAL: PCI memory mapped at 0x20200103f000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:01.0 on NUMA 
socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001040000 00:07:34.416 EAL: PCI memory mapped at 0x202001041000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:01.1 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001042000 00:07:34.416 EAL: PCI memory mapped at 0x202001043000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:01.2 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001044000 00:07:34.416 EAL: PCI memory mapped at 0x202001045000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:01.3 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001046000 00:07:34.416 EAL: PCI memory mapped at 0x202001047000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:01.4 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001048000 00:07:34.416 EAL: PCI memory mapped at 0x202001049000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:01.5 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200104a000 00:07:34.416 EAL: PCI memory mapped at 0x20200104b000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:01.6 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200104c000 00:07:34.416 EAL: PCI memory mapped at 0x20200104d000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:01.7 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200104e000 00:07:34.416 EAL: PCI memory mapped at 0x20200104f000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:02.0 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001050000 00:07:34.416 EAL: PCI memory mapped at 0x202001051000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:02.1 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001052000 00:07:34.416 EAL: PCI memory mapped at 0x202001053000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:02.2 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001054000 00:07:34.416 EAL: PCI memory mapped at 0x202001055000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:02.3 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001056000 00:07:34.416 EAL: PCI memory mapped at 0x202001057000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:07:34.416 EAL: PCI device 
0000:d0:02.4 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x202001058000 00:07:34.416 EAL: PCI memory mapped at 0x202001059000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:02.5 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200105a000 00:07:34.416 EAL: PCI memory mapped at 0x20200105b000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:02.6 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200105c000 00:07:34.416 EAL: PCI memory mapped at 0x20200105d000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:07:34.416 EAL: PCI device 0000:d0:02.7 on NUMA socket 1 00:07:34.416 EAL: probe driver: 8086:37c9 qat 00:07:34.416 EAL: PCI memory mapped at 0x20200105e000 00:07:34.416 EAL: PCI memory mapped at 0x20200105f000 00:07:34.416 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:07:34.416 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: No PCI address specified using 'addr=' in: bus=pci 00:07:34.677 EAL: Mem event callback 'spdk:(nil)' registered 00:07:34.677 00:07:34.677 00:07:34.677 CUnit - A unit testing framework for C - Version 2.1-3 00:07:34.677 http://cunit.sourceforge.net/ 00:07:34.677 00:07:34.677 00:07:34.677 Suite: components_suite 00:07:34.677 Test: vtophys_malloc_test ...passed 00:07:34.677 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:07:34.677 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.677 EAL: Restoring previous memory policy: 4 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was expanded by 4MB 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was shrunk by 4MB 00:07:34.677 EAL: Trying to obtain current memory policy. 00:07:34.677 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.677 EAL: Restoring previous memory policy: 4 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was expanded by 6MB 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was shrunk by 6MB 00:07:34.677 EAL: Trying to obtain current memory policy. 
00:07:34.677 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.677 EAL: Restoring previous memory policy: 4 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was expanded by 10MB 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was shrunk by 10MB 00:07:34.677 EAL: Trying to obtain current memory policy. 00:07:34.677 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.677 EAL: Restoring previous memory policy: 4 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was expanded by 18MB 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was shrunk by 18MB 00:07:34.677 EAL: Trying to obtain current memory policy. 00:07:34.677 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.677 EAL: Restoring previous memory policy: 4 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was expanded by 34MB 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was shrunk by 34MB 00:07:34.677 EAL: Trying to obtain current memory policy. 00:07:34.677 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.677 EAL: Restoring previous memory policy: 4 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was expanded by 66MB 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was shrunk by 66MB 00:07:34.677 EAL: Trying to obtain current memory policy. 00:07:34.677 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.677 EAL: Restoring previous memory policy: 4 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was expanded by 130MB 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was shrunk by 130MB 00:07:34.677 EAL: Trying to obtain current memory policy. 
00:07:34.677 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.677 EAL: Restoring previous memory policy: 4 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was expanded by 258MB 00:07:34.677 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.677 EAL: request: mp_malloc_sync 00:07:34.677 EAL: No shared files mode enabled, IPC is disabled 00:07:34.677 EAL: Heap on socket 0 was shrunk by 258MB 00:07:34.677 EAL: Trying to obtain current memory policy. 00:07:34.677 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:34.936 EAL: Restoring previous memory policy: 4 00:07:34.936 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.936 EAL: request: mp_malloc_sync 00:07:34.936 EAL: No shared files mode enabled, IPC is disabled 00:07:34.936 EAL: Heap on socket 0 was expanded by 514MB 00:07:34.936 EAL: Calling mem event callback 'spdk:(nil)' 00:07:34.936 EAL: request: mp_malloc_sync 00:07:34.936 EAL: No shared files mode enabled, IPC is disabled 00:07:34.936 EAL: Heap on socket 0 was shrunk by 514MB 00:07:34.936 EAL: Trying to obtain current memory policy. 00:07:34.936 EAL: Setting policy MPOL_PREFERRED for socket 0 00:07:35.197 EAL: Restoring previous memory policy: 4 00:07:35.197 EAL: Calling mem event callback 'spdk:(nil)' 00:07:35.197 EAL: request: mp_malloc_sync 00:07:35.197 EAL: No shared files mode enabled, IPC is disabled 00:07:35.197 EAL: Heap on socket 0 was expanded by 1026MB 00:07:35.197 EAL: Calling mem event callback 'spdk:(nil)' 00:07:35.197 EAL: request: mp_malloc_sync 00:07:35.197 EAL: No shared files mode enabled, IPC is disabled 00:07:35.197 EAL: Heap on socket 0 was shrunk by 1026MB 00:07:35.197 passed 00:07:35.197 00:07:35.197 Run Summary: Type Total Ran Passed Failed Inactive 00:07:35.197 suites 1 1 n/a 0 0 00:07:35.197 tests 2 2 2 0 0 00:07:35.197 asserts 6751 6751 6751 0 n/a 00:07:35.197 00:07:35.197 Elapsed time = 0.682 seconds 00:07:35.197 EAL: No shared files mode enabled, IPC is disabled 00:07:35.197 EAL: No shared files mode enabled, IPC is disabled 00:07:35.197 EAL: No shared files mode enabled, IPC is disabled 00:07:35.197 00:07:35.197 real 0m0.847s 00:07:35.197 user 0m0.425s 00:07:35.197 sys 0m0.387s 00:07:35.197 15:43:55 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.197 15:43:55 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:07:35.197 ************************************ 00:07:35.197 END TEST env_vtophys 00:07:35.197 ************************************ 00:07:35.458 15:43:55 env -- common/autotest_common.sh@1142 -- # return 0 00:07:35.458 15:43:55 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:07:35.458 15:43:55 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:35.458 15:43:55 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.458 15:43:55 env -- common/autotest_common.sh@10 -- # set +x 00:07:35.458 ************************************ 00:07:35.458 START TEST env_pci 00:07:35.458 ************************************ 00:07:35.458 15:43:55 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut 00:07:35.458 00:07:35.458 00:07:35.458 CUnit - A unit testing framework for C - Version 2.1-3 00:07:35.458 http://cunit.sourceforge.net/ 00:07:35.458 00:07:35.458 00:07:35.458 Suite: pci 00:07:35.458 Test: 
pci_hook ...[2024-07-12 15:43:55.719178] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2450139 has claimed it 00:07:35.458 EAL: Cannot find device (10000:00:01.0) 00:07:35.458 EAL: Failed to attach device on primary process 00:07:35.458 passed 00:07:35.458 00:07:35.458 Run Summary: Type Total Ran Passed Failed Inactive 00:07:35.458 suites 1 1 n/a 0 0 00:07:35.458 tests 1 1 1 0 0 00:07:35.458 asserts 25 25 25 0 n/a 00:07:35.458 00:07:35.458 Elapsed time = 0.033 seconds 00:07:35.458 00:07:35.458 real 0m0.061s 00:07:35.458 user 0m0.021s 00:07:35.458 sys 0m0.040s 00:07:35.458 15:43:55 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:35.458 15:43:55 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:07:35.458 ************************************ 00:07:35.458 END TEST env_pci 00:07:35.458 ************************************ 00:07:35.458 15:43:55 env -- common/autotest_common.sh@1142 -- # return 0 00:07:35.458 15:43:55 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:07:35.458 15:43:55 env -- env/env.sh@15 -- # uname 00:07:35.458 15:43:55 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:07:35.458 15:43:55 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:07:35.458 15:43:55 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:35.458 15:43:55 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:07:35.458 15:43:55 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:35.458 15:43:55 env -- common/autotest_common.sh@10 -- # set +x 00:07:35.458 ************************************ 00:07:35.458 START TEST env_dpdk_post_init 00:07:35.458 ************************************ 00:07:35.458 15:43:55 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:07:35.458 EAL: Detected CPU lcores: 128 00:07:35.458 EAL: Detected NUMA nodes: 2 00:07:35.458 EAL: Detected shared linkage of DPDK 00:07:35.458 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:35.458 EAL: Selected IOVA mode 'PA' 00:07:35.458 EAL: VFIO support initialized 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, 
max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: 
Creating cryptodev 0000:cc:02.2_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.720 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym 00:07:35.720 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.720 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym 
00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters 
- name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max 
queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: 
qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym 00:07:35.721 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.721 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:07:35.721 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym 00:07:35.722 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.722 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym 00:07:35.722 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.722 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:07:35.722 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym 00:07:35.722 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.722 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym 00:07:35.722 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.722 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:07:35.722 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym 00:07:35.722 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:35.722 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym 00:07:35.722 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:35.722 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:35.722 EAL: Using IOMMU type 1 (Type 1) 00:07:35.722 EAL: Ignore mapping IO port bar(1) 00:07:35.982 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.0 (socket 0) 00:07:35.982 EAL: Ignore mapping IO port bar(1) 00:07:36.242 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.1 (socket 0) 00:07:36.242 EAL: Ignore mapping IO port bar(1) 00:07:36.243 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.2 (socket 0) 00:07:36.502 EAL: Ignore mapping IO port bar(1) 00:07:36.502 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.3 (socket 0) 00:07:36.762 EAL: Ignore mapping IO port bar(1) 00:07:36.762 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.4 (socket 0) 00:07:37.022 EAL: Ignore mapping IO port bar(1) 00:07:37.022 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.5 (socket 0) 00:07:37.022 EAL: Ignore mapping IO port bar(1) 00:07:37.282 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.6 (socket 0) 00:07:37.282 EAL: Ignore mapping IO port bar(1) 00:07:37.542 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.7 (socket 0) 00:07:38.113 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:65:00.0 (socket 0) 00:07:38.374 EAL: Ignore mapping IO port bar(1) 00:07:38.374 EAL: Probe PCI driver: 
spdk_ioat (8086:0b00) device: 0000:80:01.0 (socket 1) 00:07:38.638 EAL: Ignore mapping IO port bar(1) 00:07:38.638 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.1 (socket 1) 00:07:38.899 EAL: Ignore mapping IO port bar(1) 00:07:38.899 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.2 (socket 1) 00:07:38.899 EAL: Ignore mapping IO port bar(1) 00:07:39.160 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.3 (socket 1) 00:07:39.160 EAL: Ignore mapping IO port bar(1) 00:07:39.421 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.4 (socket 1) 00:07:39.421 EAL: Ignore mapping IO port bar(1) 00:07:39.421 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.5 (socket 1) 00:07:39.681 EAL: Ignore mapping IO port bar(1) 00:07:39.681 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.6 (socket 1) 00:07:39.942 EAL: Ignore mapping IO port bar(1) 00:07:39.942 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.7 (socket 1) 00:07:44.145 EAL: Releasing PCI mapped resource for 0000:65:00.0 00:07:44.145 EAL: Calling pci_unmap_resource for 0000:65:00.0 at 0x202001080000 00:07:44.145 Starting DPDK initialization... 00:07:44.145 Starting SPDK post initialization... 00:07:44.145 SPDK NVMe probe 00:07:44.145 Attaching to 0000:65:00.0 00:07:44.145 Attached to 0000:65:00.0 00:07:44.145 Cleaning up... 00:07:46.054 00:07:46.054 real 0m10.405s 00:07:46.054 user 0m4.246s 00:07:46.054 sys 0m0.183s 00:07:46.054 15:44:06 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.054 15:44:06 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:07:46.054 ************************************ 00:07:46.054 END TEST env_dpdk_post_init 00:07:46.054 ************************************ 00:07:46.054 15:44:06 env -- common/autotest_common.sh@1142 -- # return 0 00:07:46.054 15:44:06 env -- env/env.sh@26 -- # uname 00:07:46.054 15:44:06 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:07:46.054 15:44:06 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:46.054 15:44:06 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:46.054 15:44:06 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.054 15:44:06 env -- common/autotest_common.sh@10 -- # set +x 00:07:46.054 ************************************ 00:07:46.054 START TEST env_mem_callbacks 00:07:46.054 ************************************ 00:07:46.054 15:44:06 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:07:46.054 EAL: Detected CPU lcores: 128 00:07:46.054 EAL: Detected NUMA nodes: 2 00:07:46.054 EAL: Detected shared linkage of DPDK 00:07:46.054 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:07:46.054 EAL: Selected IOVA mode 'PA' 00:07:46.054 EAL: VFIO support initialized 00:07:46.054 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:07:46.054 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym 00:07:46.054 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.054 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym 00:07:46.054 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.054 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:07:46.054 CRYPTODEV: 
Creating cryptodev 0000:cc:01.1_qat_asym 00:07:46.054 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.054 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym 00:07:46.054 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.054 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:07:46.054 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym 00:07:46.054 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.054 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym 00:07:46.054 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.054 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:07:46.054 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym 00:07:46.054 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.054 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym 
00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters 
- name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max 
queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.055 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym 00:07:46.055 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.055 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating 
cryptodev 0000:d0:01.1_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym 00:07:46.056 
CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:07:46.056 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym 00:07:46.056 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:07:46.056 TELEMETRY: No legacy callbacks, legacy socket not created 00:07:46.056 00:07:46.056 00:07:46.056 CUnit - A unit testing framework for C - Version 2.1-3 00:07:46.056 http://cunit.sourceforge.net/ 00:07:46.056 00:07:46.056 00:07:46.056 Suite: memory 00:07:46.056 Test: test ... 
00:07:46.056 register 0x200000200000 2097152 00:07:46.056 register 0x201000a00000 2097152 00:07:46.056 malloc 3145728 00:07:46.056 register 0x200000400000 4194304 00:07:46.056 buf 0x200000500000 len 3145728 PASSED 00:07:46.056 malloc 64 00:07:46.056 buf 0x2000004fff40 len 64 PASSED 00:07:46.056 malloc 4194304 00:07:46.056 register 0x200000800000 6291456 00:07:46.056 buf 0x200000a00000 len 4194304 PASSED 00:07:46.056 free 0x200000500000 3145728 00:07:46.056 free 0x2000004fff40 64 00:07:46.056 unregister 0x200000400000 4194304 PASSED 00:07:46.056 free 0x200000a00000 4194304 00:07:46.056 unregister 0x200000800000 6291456 PASSED 00:07:46.056 malloc 8388608 00:07:46.056 register 0x200000400000 10485760 00:07:46.056 buf 0x200000600000 len 8388608 PASSED 00:07:46.056 free 0x200000600000 8388608 00:07:46.056 unregister 0x200000400000 10485760 PASSED 00:07:46.056 passed 00:07:46.056 00:07:46.056 Run Summary: Type Total Ran Passed Failed Inactive 00:07:46.056 suites 1 1 n/a 0 0 00:07:46.056 tests 1 1 1 0 0 00:07:46.056 asserts 16 16 16 0 n/a 00:07:46.056 00:07:46.056 Elapsed time = 0.007 seconds 00:07:46.056 00:07:46.056 real 0m0.085s 00:07:46.056 user 0m0.030s 00:07:46.056 sys 0m0.055s 00:07:46.056 15:44:06 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.056 15:44:06 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:07:46.056 ************************************ 00:07:46.056 END TEST env_mem_callbacks 00:07:46.056 ************************************ 00:07:46.056 15:44:06 env -- common/autotest_common.sh@1142 -- # return 0 00:07:46.056 00:07:46.056 real 0m12.090s 00:07:46.056 user 0m5.088s 00:07:46.056 sys 0m1.017s 00:07:46.056 15:44:06 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:46.056 15:44:06 env -- common/autotest_common.sh@10 -- # set +x 00:07:46.056 ************************************ 00:07:46.056 END TEST env 00:07:46.056 ************************************ 00:07:46.056 15:44:06 -- common/autotest_common.sh@1142 -- # return 0 00:07:46.056 15:44:06 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:07:46.056 15:44:06 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:46.056 15:44:06 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:46.056 15:44:06 -- common/autotest_common.sh@10 -- # set +x 00:07:46.317 ************************************ 00:07:46.318 START TEST rpc 00:07:46.318 ************************************ 00:07:46.318 15:44:06 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:07:46.318 * Looking for test storage... 
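The env_mem_callbacks trace just above pairs every DPDK allocation that grows the heap with a register event and every shrink with an unregister event, which is what SPDK's memory-hook machinery depends on. For reference, the same unit test can be re-run on its own from the build tree used in this log; this is a hedged sketch (the sudo and the working directory are assumptions, not part of the harness output):
  # Re-run only the memory-callback unit test from this build tree.
  # On success it prints the same register/unregister/buf lines as above.
  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  sudo ./test/env/mem_callbacks/mem_callbacks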
00:07:46.318 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:46.318 15:44:06 rpc -- rpc/rpc.sh@65 -- # spdk_pid=2452080 00:07:46.318 15:44:06 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:46.318 15:44:06 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:07:46.318 15:44:06 rpc -- rpc/rpc.sh@67 -- # waitforlisten 2452080 00:07:46.318 15:44:06 rpc -- common/autotest_common.sh@829 -- # '[' -z 2452080 ']' 00:07:46.318 15:44:06 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:46.318 15:44:06 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:46.318 15:44:06 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:46.318 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:46.318 15:44:06 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:46.318 15:44:06 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:46.318 [2024-07-12 15:44:06.701732] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:07:46.318 [2024-07-12 15:44:06.701789] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2452080 ] 00:07:46.578 [2024-07-12 15:44:06.795346] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.578 [2024-07-12 15:44:06.863100] app.c: 607:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:07:46.578 [2024-07-12 15:44:06.863140] app.c: 608:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2452080' to capture a snapshot of events at runtime. 00:07:46.578 [2024-07-12 15:44:06.863148] app.c: 613:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:07:46.578 [2024-07-12 15:44:06.863154] app.c: 614:app_setup_trace: *NOTICE*: SPDK application currently running. 00:07:46.578 [2024-07-12 15:44:06.863161] app.c: 615:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2452080 for offline analysis/debug. 
00:07:46.578 [2024-07-12 15:44:06.863182] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.151 15:44:07 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:47.151 15:44:07 rpc -- common/autotest_common.sh@862 -- # return 0 00:07:47.151 15:44:07 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:47.151 15:44:07 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:47.151 15:44:07 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:07:47.151 15:44:07 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:07:47.151 15:44:07 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:47.151 15:44:07 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.151 15:44:07 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:47.151 ************************************ 00:07:47.151 START TEST rpc_integrity 00:07:47.151 ************************************ 00:07:47.151 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:07:47.151 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:47.151 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.151 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:47.151 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.151 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:47.151 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:07:47.412 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:47.412 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:47.412 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.412 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:47.412 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.412 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:07:47.412 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:47.412 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.412 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:47.412 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.412 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:47.412 { 00:07:47.412 "name": "Malloc0", 00:07:47.412 "aliases": [ 00:07:47.412 "793ba5da-cbc0-4d3d-91fd-5222ef86b10e" 00:07:47.412 ], 00:07:47.412 "product_name": "Malloc disk", 00:07:47.412 "block_size": 512, 00:07:47.412 "num_blocks": 16384, 00:07:47.412 "uuid": "793ba5da-cbc0-4d3d-91fd-5222ef86b10e", 00:07:47.412 "assigned_rate_limits": { 00:07:47.412 "rw_ios_per_sec": 0, 00:07:47.412 "rw_mbytes_per_sec": 0, 00:07:47.412 "r_mbytes_per_sec": 0, 00:07:47.412 "w_mbytes_per_sec": 0 00:07:47.412 }, 00:07:47.412 
"claimed": false, 00:07:47.412 "zoned": false, 00:07:47.412 "supported_io_types": { 00:07:47.412 "read": true, 00:07:47.412 "write": true, 00:07:47.412 "unmap": true, 00:07:47.412 "flush": true, 00:07:47.412 "reset": true, 00:07:47.412 "nvme_admin": false, 00:07:47.412 "nvme_io": false, 00:07:47.412 "nvme_io_md": false, 00:07:47.412 "write_zeroes": true, 00:07:47.412 "zcopy": true, 00:07:47.412 "get_zone_info": false, 00:07:47.412 "zone_management": false, 00:07:47.412 "zone_append": false, 00:07:47.412 "compare": false, 00:07:47.412 "compare_and_write": false, 00:07:47.412 "abort": true, 00:07:47.412 "seek_hole": false, 00:07:47.412 "seek_data": false, 00:07:47.412 "copy": true, 00:07:47.412 "nvme_iov_md": false 00:07:47.412 }, 00:07:47.412 "memory_domains": [ 00:07:47.412 { 00:07:47.412 "dma_device_id": "system", 00:07:47.412 "dma_device_type": 1 00:07:47.412 }, 00:07:47.412 { 00:07:47.412 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:47.412 "dma_device_type": 2 00:07:47.413 } 00:07:47.413 ], 00:07:47.413 "driver_specific": {} 00:07:47.413 } 00:07:47.413 ]' 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:47.413 [2024-07-12 15:44:07.704854] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:07:47.413 [2024-07-12 15:44:07.704888] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:47.413 [2024-07-12 15:44:07.704900] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa67ed0 00:07:47.413 [2024-07-12 15:44:07.704906] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:47.413 [2024-07-12 15:44:07.706182] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:47.413 [2024-07-12 15:44:07.706203] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:47.413 Passthru0 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:47.413 { 00:07:47.413 "name": "Malloc0", 00:07:47.413 "aliases": [ 00:07:47.413 "793ba5da-cbc0-4d3d-91fd-5222ef86b10e" 00:07:47.413 ], 00:07:47.413 "product_name": "Malloc disk", 00:07:47.413 "block_size": 512, 00:07:47.413 "num_blocks": 16384, 00:07:47.413 "uuid": "793ba5da-cbc0-4d3d-91fd-5222ef86b10e", 00:07:47.413 "assigned_rate_limits": { 00:07:47.413 "rw_ios_per_sec": 0, 00:07:47.413 "rw_mbytes_per_sec": 0, 00:07:47.413 "r_mbytes_per_sec": 0, 00:07:47.413 "w_mbytes_per_sec": 0 00:07:47.413 }, 00:07:47.413 "claimed": true, 00:07:47.413 "claim_type": "exclusive_write", 00:07:47.413 "zoned": false, 00:07:47.413 "supported_io_types": { 00:07:47.413 "read": true, 00:07:47.413 "write": true, 00:07:47.413 "unmap": true, 00:07:47.413 "flush": true, 
00:07:47.413 "reset": true, 00:07:47.413 "nvme_admin": false, 00:07:47.413 "nvme_io": false, 00:07:47.413 "nvme_io_md": false, 00:07:47.413 "write_zeroes": true, 00:07:47.413 "zcopy": true, 00:07:47.413 "get_zone_info": false, 00:07:47.413 "zone_management": false, 00:07:47.413 "zone_append": false, 00:07:47.413 "compare": false, 00:07:47.413 "compare_and_write": false, 00:07:47.413 "abort": true, 00:07:47.413 "seek_hole": false, 00:07:47.413 "seek_data": false, 00:07:47.413 "copy": true, 00:07:47.413 "nvme_iov_md": false 00:07:47.413 }, 00:07:47.413 "memory_domains": [ 00:07:47.413 { 00:07:47.413 "dma_device_id": "system", 00:07:47.413 "dma_device_type": 1 00:07:47.413 }, 00:07:47.413 { 00:07:47.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:47.413 "dma_device_type": 2 00:07:47.413 } 00:07:47.413 ], 00:07:47.413 "driver_specific": {} 00:07:47.413 }, 00:07:47.413 { 00:07:47.413 "name": "Passthru0", 00:07:47.413 "aliases": [ 00:07:47.413 "848e9e83-9b96-58e0-a754-d790f5882422" 00:07:47.413 ], 00:07:47.413 "product_name": "passthru", 00:07:47.413 "block_size": 512, 00:07:47.413 "num_blocks": 16384, 00:07:47.413 "uuid": "848e9e83-9b96-58e0-a754-d790f5882422", 00:07:47.413 "assigned_rate_limits": { 00:07:47.413 "rw_ios_per_sec": 0, 00:07:47.413 "rw_mbytes_per_sec": 0, 00:07:47.413 "r_mbytes_per_sec": 0, 00:07:47.413 "w_mbytes_per_sec": 0 00:07:47.413 }, 00:07:47.413 "claimed": false, 00:07:47.413 "zoned": false, 00:07:47.413 "supported_io_types": { 00:07:47.413 "read": true, 00:07:47.413 "write": true, 00:07:47.413 "unmap": true, 00:07:47.413 "flush": true, 00:07:47.413 "reset": true, 00:07:47.413 "nvme_admin": false, 00:07:47.413 "nvme_io": false, 00:07:47.413 "nvme_io_md": false, 00:07:47.413 "write_zeroes": true, 00:07:47.413 "zcopy": true, 00:07:47.413 "get_zone_info": false, 00:07:47.413 "zone_management": false, 00:07:47.413 "zone_append": false, 00:07:47.413 "compare": false, 00:07:47.413 "compare_and_write": false, 00:07:47.413 "abort": true, 00:07:47.413 "seek_hole": false, 00:07:47.413 "seek_data": false, 00:07:47.413 "copy": true, 00:07:47.413 "nvme_iov_md": false 00:07:47.413 }, 00:07:47.413 "memory_domains": [ 00:07:47.413 { 00:07:47.413 "dma_device_id": "system", 00:07:47.413 "dma_device_type": 1 00:07:47.413 }, 00:07:47.413 { 00:07:47.413 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:47.413 "dma_device_type": 2 00:07:47.413 } 00:07:47.413 ], 00:07:47.413 "driver_specific": { 00:07:47.413 "passthru": { 00:07:47.413 "name": "Passthru0", 00:07:47.413 "base_bdev_name": "Malloc0" 00:07:47.413 } 00:07:47.413 } 00:07:47.413 } 00:07:47.413 ]' 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd 
bdev_get_bdevs 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:47.413 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:47.413 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:47.674 15:44:07 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:47.674 00:07:47.674 real 0m0.299s 00:07:47.674 user 0m0.193s 00:07:47.674 sys 0m0.040s 00:07:47.674 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.674 15:44:07 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:47.674 ************************************ 00:07:47.674 END TEST rpc_integrity 00:07:47.674 ************************************ 00:07:47.674 15:44:07 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:47.674 15:44:07 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:07:47.674 15:44:07 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:47.674 15:44:07 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.674 15:44:07 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:47.674 ************************************ 00:07:47.674 START TEST rpc_plugins 00:07:47.674 ************************************ 00:07:47.674 15:44:07 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:07:47.674 15:44:07 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:07:47.674 15:44:07 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.674 15:44:07 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:47.674 15:44:07 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.674 15:44:07 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:07:47.674 15:44:07 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:07:47.674 15:44:07 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.674 15:44:07 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:47.674 15:44:07 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.674 15:44:07 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:07:47.674 { 00:07:47.674 "name": "Malloc1", 00:07:47.674 "aliases": [ 00:07:47.674 "f61d9d97-4cd4-4317-b6d2-eeca44f65f89" 00:07:47.674 ], 00:07:47.674 "product_name": "Malloc disk", 00:07:47.674 "block_size": 4096, 00:07:47.674 "num_blocks": 256, 00:07:47.674 "uuid": "f61d9d97-4cd4-4317-b6d2-eeca44f65f89", 00:07:47.674 "assigned_rate_limits": { 00:07:47.674 "rw_ios_per_sec": 0, 00:07:47.674 "rw_mbytes_per_sec": 0, 00:07:47.674 "r_mbytes_per_sec": 0, 00:07:47.674 "w_mbytes_per_sec": 0 00:07:47.674 }, 00:07:47.674 "claimed": false, 00:07:47.674 "zoned": false, 00:07:47.674 "supported_io_types": { 00:07:47.674 "read": true, 00:07:47.674 "write": true, 00:07:47.674 "unmap": true, 00:07:47.674 "flush": true, 00:07:47.674 "reset": true, 00:07:47.674 "nvme_admin": false, 00:07:47.674 "nvme_io": false, 00:07:47.674 "nvme_io_md": false, 00:07:47.674 "write_zeroes": true, 00:07:47.674 "zcopy": true, 00:07:47.674 "get_zone_info": false, 00:07:47.674 "zone_management": false, 00:07:47.674 "zone_append": false, 00:07:47.674 "compare": false, 00:07:47.674 "compare_and_write": false, 00:07:47.674 "abort": true, 00:07:47.674 "seek_hole": false, 00:07:47.674 "seek_data": 
false, 00:07:47.674 "copy": true, 00:07:47.674 "nvme_iov_md": false 00:07:47.674 }, 00:07:47.674 "memory_domains": [ 00:07:47.674 { 00:07:47.674 "dma_device_id": "system", 00:07:47.674 "dma_device_type": 1 00:07:47.674 }, 00:07:47.674 { 00:07:47.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:47.674 "dma_device_type": 2 00:07:47.674 } 00:07:47.674 ], 00:07:47.674 "driver_specific": {} 00:07:47.674 } 00:07:47.674 ]' 00:07:47.674 15:44:07 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:07:47.674 15:44:08 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:07:47.674 15:44:08 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:07:47.674 15:44:08 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.674 15:44:08 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:47.674 15:44:08 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.674 15:44:08 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:07:47.674 15:44:08 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.674 15:44:08 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:47.674 15:44:08 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.674 15:44:08 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:07:47.674 15:44:08 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:07:47.674 15:44:08 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:07:47.674 00:07:47.674 real 0m0.151s 00:07:47.674 user 0m0.096s 00:07:47.674 sys 0m0.018s 00:07:47.674 15:44:08 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:47.674 15:44:08 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:07:47.674 ************************************ 00:07:47.674 END TEST rpc_plugins 00:07:47.674 ************************************ 00:07:47.674 15:44:08 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:47.674 15:44:08 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:07:47.674 15:44:08 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:47.674 15:44:08 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:47.674 15:44:08 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:47.934 ************************************ 00:07:47.934 START TEST rpc_trace_cmd_test 00:07:47.934 ************************************ 00:07:47.934 15:44:08 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:07:47.934 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:07:47.934 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:07:47.934 15:44:08 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:47.934 15:44:08 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:47.934 15:44:08 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:47.934 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:07:47.934 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid2452080", 00:07:47.934 "tpoint_group_mask": "0x8", 00:07:47.934 "iscsi_conn": { 00:07:47.934 "mask": "0x2", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "scsi": { 00:07:47.935 "mask": "0x4", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "bdev": { 00:07:47.935 "mask": "0x8", 00:07:47.935 "tpoint_mask": "0xffffffffffffffff" 00:07:47.935 }, 00:07:47.935 "nvmf_rdma": { 00:07:47.935 
"mask": "0x10", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "nvmf_tcp": { 00:07:47.935 "mask": "0x20", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "ftl": { 00:07:47.935 "mask": "0x40", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "blobfs": { 00:07:47.935 "mask": "0x80", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "dsa": { 00:07:47.935 "mask": "0x200", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "thread": { 00:07:47.935 "mask": "0x400", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "nvme_pcie": { 00:07:47.935 "mask": "0x800", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "iaa": { 00:07:47.935 "mask": "0x1000", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "nvme_tcp": { 00:07:47.935 "mask": "0x2000", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "bdev_nvme": { 00:07:47.935 "mask": "0x4000", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 }, 00:07:47.935 "sock": { 00:07:47.935 "mask": "0x8000", 00:07:47.935 "tpoint_mask": "0x0" 00:07:47.935 } 00:07:47.935 }' 00:07:47.935 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:07:47.935 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:07:47.935 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:07:47.935 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:07:47.935 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:07:47.935 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:07:47.935 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:07:47.935 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:07:47.935 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:07:48.196 15:44:08 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:07:48.196 00:07:48.196 real 0m0.247s 00:07:48.196 user 0m0.213s 00:07:48.196 sys 0m0.027s 00:07:48.196 15:44:08 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:48.196 15:44:08 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:07:48.196 ************************************ 00:07:48.196 END TEST rpc_trace_cmd_test 00:07:48.196 ************************************ 00:07:48.196 15:44:08 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:48.196 15:44:08 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:07:48.196 15:44:08 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:07:48.196 15:44:08 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:07:48.196 15:44:08 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:48.196 15:44:08 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:48.196 15:44:08 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:48.196 ************************************ 00:07:48.196 START TEST rpc_daemon_integrity 00:07:48.196 ************************************ 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:07:48.196 { 00:07:48.196 "name": "Malloc2", 00:07:48.196 "aliases": [ 00:07:48.196 "3c644354-e506-43c6-ad04-052e44ab10f1" 00:07:48.196 ], 00:07:48.196 "product_name": "Malloc disk", 00:07:48.196 "block_size": 512, 00:07:48.196 "num_blocks": 16384, 00:07:48.196 "uuid": "3c644354-e506-43c6-ad04-052e44ab10f1", 00:07:48.196 "assigned_rate_limits": { 00:07:48.196 "rw_ios_per_sec": 0, 00:07:48.196 "rw_mbytes_per_sec": 0, 00:07:48.196 "r_mbytes_per_sec": 0, 00:07:48.196 "w_mbytes_per_sec": 0 00:07:48.196 }, 00:07:48.196 "claimed": false, 00:07:48.196 "zoned": false, 00:07:48.196 "supported_io_types": { 00:07:48.196 "read": true, 00:07:48.196 "write": true, 00:07:48.196 "unmap": true, 00:07:48.196 "flush": true, 00:07:48.196 "reset": true, 00:07:48.196 "nvme_admin": false, 00:07:48.196 "nvme_io": false, 00:07:48.196 "nvme_io_md": false, 00:07:48.196 "write_zeroes": true, 00:07:48.196 "zcopy": true, 00:07:48.196 "get_zone_info": false, 00:07:48.196 "zone_management": false, 00:07:48.196 "zone_append": false, 00:07:48.196 "compare": false, 00:07:48.196 "compare_and_write": false, 00:07:48.196 "abort": true, 00:07:48.196 "seek_hole": false, 00:07:48.196 "seek_data": false, 00:07:48.196 "copy": true, 00:07:48.196 "nvme_iov_md": false 00:07:48.196 }, 00:07:48.196 "memory_domains": [ 00:07:48.196 { 00:07:48.196 "dma_device_id": "system", 00:07:48.196 "dma_device_type": 1 00:07:48.196 }, 00:07:48.196 { 00:07:48.196 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:48.196 "dma_device_type": 2 00:07:48.196 } 00:07:48.196 ], 00:07:48.196 "driver_specific": {} 00:07:48.196 } 00:07:48.196 ]' 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:48.196 [2024-07-12 15:44:08.619294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:07:48.196 [2024-07-12 15:44:08.619321] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:48.196 
[2024-07-12 15:44:08.619334] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc0ba00 00:07:48.196 [2024-07-12 15:44:08.619340] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:48.196 [2024-07-12 15:44:08.620475] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:48.196 [2024-07-12 15:44:08.620493] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:07:48.196 Passthru0 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.196 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:48.457 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.457 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:07:48.457 { 00:07:48.457 "name": "Malloc2", 00:07:48.457 "aliases": [ 00:07:48.457 "3c644354-e506-43c6-ad04-052e44ab10f1" 00:07:48.457 ], 00:07:48.457 "product_name": "Malloc disk", 00:07:48.457 "block_size": 512, 00:07:48.457 "num_blocks": 16384, 00:07:48.457 "uuid": "3c644354-e506-43c6-ad04-052e44ab10f1", 00:07:48.457 "assigned_rate_limits": { 00:07:48.457 "rw_ios_per_sec": 0, 00:07:48.457 "rw_mbytes_per_sec": 0, 00:07:48.457 "r_mbytes_per_sec": 0, 00:07:48.457 "w_mbytes_per_sec": 0 00:07:48.457 }, 00:07:48.457 "claimed": true, 00:07:48.457 "claim_type": "exclusive_write", 00:07:48.457 "zoned": false, 00:07:48.457 "supported_io_types": { 00:07:48.457 "read": true, 00:07:48.457 "write": true, 00:07:48.457 "unmap": true, 00:07:48.457 "flush": true, 00:07:48.457 "reset": true, 00:07:48.457 "nvme_admin": false, 00:07:48.457 "nvme_io": false, 00:07:48.457 "nvme_io_md": false, 00:07:48.457 "write_zeroes": true, 00:07:48.457 "zcopy": true, 00:07:48.457 "get_zone_info": false, 00:07:48.457 "zone_management": false, 00:07:48.457 "zone_append": false, 00:07:48.457 "compare": false, 00:07:48.457 "compare_and_write": false, 00:07:48.457 "abort": true, 00:07:48.457 "seek_hole": false, 00:07:48.457 "seek_data": false, 00:07:48.457 "copy": true, 00:07:48.457 "nvme_iov_md": false 00:07:48.457 }, 00:07:48.457 "memory_domains": [ 00:07:48.457 { 00:07:48.457 "dma_device_id": "system", 00:07:48.457 "dma_device_type": 1 00:07:48.457 }, 00:07:48.457 { 00:07:48.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:48.457 "dma_device_type": 2 00:07:48.457 } 00:07:48.457 ], 00:07:48.457 "driver_specific": {} 00:07:48.457 }, 00:07:48.457 { 00:07:48.457 "name": "Passthru0", 00:07:48.457 "aliases": [ 00:07:48.457 "bc7d4080-5148-54e8-ae4d-eba757b173cd" 00:07:48.457 ], 00:07:48.457 "product_name": "passthru", 00:07:48.457 "block_size": 512, 00:07:48.457 "num_blocks": 16384, 00:07:48.457 "uuid": "bc7d4080-5148-54e8-ae4d-eba757b173cd", 00:07:48.457 "assigned_rate_limits": { 00:07:48.457 "rw_ios_per_sec": 0, 00:07:48.457 "rw_mbytes_per_sec": 0, 00:07:48.457 "r_mbytes_per_sec": 0, 00:07:48.457 "w_mbytes_per_sec": 0 00:07:48.457 }, 00:07:48.457 "claimed": false, 00:07:48.457 "zoned": false, 00:07:48.457 "supported_io_types": { 00:07:48.457 "read": true, 00:07:48.457 "write": true, 00:07:48.457 "unmap": true, 00:07:48.457 "flush": true, 00:07:48.457 "reset": true, 00:07:48.457 "nvme_admin": false, 00:07:48.457 "nvme_io": false, 00:07:48.457 "nvme_io_md": false, 00:07:48.457 
"write_zeroes": true, 00:07:48.457 "zcopy": true, 00:07:48.457 "get_zone_info": false, 00:07:48.457 "zone_management": false, 00:07:48.457 "zone_append": false, 00:07:48.457 "compare": false, 00:07:48.457 "compare_and_write": false, 00:07:48.457 "abort": true, 00:07:48.457 "seek_hole": false, 00:07:48.457 "seek_data": false, 00:07:48.457 "copy": true, 00:07:48.457 "nvme_iov_md": false 00:07:48.457 }, 00:07:48.457 "memory_domains": [ 00:07:48.457 { 00:07:48.457 "dma_device_id": "system", 00:07:48.457 "dma_device_type": 1 00:07:48.457 }, 00:07:48.457 { 00:07:48.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:07:48.457 "dma_device_type": 2 00:07:48.457 } 00:07:48.457 ], 00:07:48.457 "driver_specific": { 00:07:48.457 "passthru": { 00:07:48.457 "name": "Passthru0", 00:07:48.457 "base_bdev_name": "Malloc2" 00:07:48.457 } 00:07:48.457 } 00:07:48.458 } 00:07:48.458 ]' 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:07:48.458 00:07:48.458 real 0m0.289s 00:07:48.458 user 0m0.183s 00:07:48.458 sys 0m0.044s 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:48.458 15:44:08 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:07:48.458 ************************************ 00:07:48.458 END TEST rpc_daemon_integrity 00:07:48.458 ************************************ 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:48.458 15:44:08 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:07:48.458 15:44:08 rpc -- rpc/rpc.sh@84 -- # killprocess 2452080 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@948 -- # '[' -z 2452080 ']' 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@952 -- # kill -0 2452080 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@953 -- # uname 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2452080 00:07:48.458 15:44:08 rpc -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2452080' 00:07:48.458 killing process with pid 2452080 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@967 -- # kill 2452080 00:07:48.458 15:44:08 rpc -- common/autotest_common.sh@972 -- # wait 2452080 00:07:48.718 00:07:48.718 real 0m2.523s 00:07:48.718 user 0m3.343s 00:07:48.718 sys 0m0.714s 00:07:48.718 15:44:09 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:48.718 15:44:09 rpc -- common/autotest_common.sh@10 -- # set +x 00:07:48.718 ************************************ 00:07:48.718 END TEST rpc 00:07:48.718 ************************************ 00:07:48.719 15:44:09 -- common/autotest_common.sh@1142 -- # return 0 00:07:48.719 15:44:09 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:48.719 15:44:09 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:48.719 15:44:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:48.719 15:44:09 -- common/autotest_common.sh@10 -- # set +x 00:07:48.719 ************************************ 00:07:48.719 START TEST skip_rpc 00:07:48.719 ************************************ 00:07:48.719 15:44:09 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:07:48.979 * Looking for test storage... 00:07:48.979 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:07:48.979 15:44:09 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:48.979 15:44:09 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:48.979 15:44:09 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:07:48.979 15:44:09 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:48.979 15:44:09 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:48.979 15:44:09 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:48.979 ************************************ 00:07:48.979 START TEST skip_rpc 00:07:48.979 ************************************ 00:07:48.979 15:44:09 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:07:48.979 15:44:09 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=2452590 00:07:48.979 15:44:09 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:48.979 15:44:09 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:07:48.979 15:44:09 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:07:48.979 [2024-07-12 15:44:09.334646] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
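The rpc suite that just finished (rpc_integrity, rpc_plugins, rpc_trace_cmd_test, rpc_daemon_integrity) drives plain bdev RPCs against the spdk_tgt started with '-e bdev'. The sequence below is a hedged manual replay built only from the method names visible in the log, not from rpc.sh itself; it assumes a target already listening on /var/tmp/spdk.sock and rpc.py invoked from the SPDK tree:
  # Replay of the bdev calls rpc_integrity checks above.
  ./scripts/rpc.py bdev_malloc_create 8 512                     # 8 MiB in 512-byte blocks -> Malloc0
  ./scripts/rpc.py bdev_passthru_create -b Malloc0 -p Passthru0
  ./scripts/rpc.py bdev_get_bdevs | jq length                   # the test expects 2 here
  ./scripts/rpc.py bdev_passthru_delete Passthru0
  ./scripts/rpc.py bdev_malloc_delete Malloc0
  ./scripts/rpc.py bdev_get_bdevs | jq length                   # and 0 again after cleanup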
00:07:48.979 [2024-07-12 15:44:09.334700] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2452590 ] 00:07:49.239 [2024-07-12 15:44:09.427533] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.239 [2024-07-12 15:44:09.500436] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 2452590 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 2452590 ']' 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 2452590 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2452590 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2452590' 00:07:54.520 killing process with pid 2452590 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 2452590 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 2452590 00:07:54.520 00:07:54.520 real 0m5.275s 00:07:54.520 user 0m5.056s 00:07:54.520 sys 0m0.242s 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:54.520 15:44:14 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:54.520 ************************************ 00:07:54.520 END TEST skip_rpc 00:07:54.520 
************************************ 00:07:54.520 15:44:14 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:54.520 15:44:14 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:07:54.520 15:44:14 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:54.520 15:44:14 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:54.520 15:44:14 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:54.520 ************************************ 00:07:54.520 START TEST skip_rpc_with_json 00:07:54.520 ************************************ 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=2453530 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 2453530 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 2453530 ']' 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:54.520 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:54.520 15:44:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:54.520 [2024-07-12 15:44:14.685728] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:07:54.520 [2024-07-12 15:44:14.685784] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2453530 ] 00:07:54.520 [2024-07-12 15:44:14.777088] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.520 [2024-07-12 15:44:14.854899] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.092 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:55.092 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:07:55.092 15:44:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:07:55.092 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:55.092 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:55.092 [2024-07-12 15:44:15.527169] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:07:55.092 request: 00:07:55.092 { 00:07:55.092 "trtype": "tcp", 00:07:55.092 "method": "nvmf_get_transports", 00:07:55.092 "req_id": 1 00:07:55.092 } 00:07:55.092 Got JSON-RPC error response 00:07:55.092 response: 00:07:55.092 { 00:07:55.092 "code": -19, 00:07:55.092 "message": "No such device" 00:07:55.092 } 00:07:55.092 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:07:55.092 15:44:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:07:55.092 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:55.092 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:55.092 [2024-07-12 15:44:15.539289] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.352 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:55.352 15:44:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:07:55.352 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:55.352 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:55.352 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:55.352 15:44:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:55.352 { 00:07:55.352 "subsystems": [ 00:07:55.352 { 00:07:55.352 "subsystem": "keyring", 00:07:55.352 "config": [] 00:07:55.352 }, 00:07:55.352 { 00:07:55.352 "subsystem": "iobuf", 00:07:55.352 "config": [ 00:07:55.352 { 00:07:55.352 "method": "iobuf_set_options", 00:07:55.352 "params": { 00:07:55.352 "small_pool_count": 8192, 00:07:55.352 "large_pool_count": 1024, 00:07:55.352 "small_bufsize": 8192, 00:07:55.352 "large_bufsize": 135168 00:07:55.352 } 00:07:55.352 } 00:07:55.352 ] 00:07:55.352 }, 00:07:55.352 { 00:07:55.352 "subsystem": "sock", 00:07:55.352 "config": [ 00:07:55.352 { 00:07:55.352 "method": "sock_set_default_impl", 00:07:55.352 "params": { 00:07:55.352 "impl_name": "posix" 00:07:55.352 } 00:07:55.352 }, 00:07:55.352 { 00:07:55.352 "method": "sock_impl_set_options", 00:07:55.352 "params": { 00:07:55.352 "impl_name": "ssl", 00:07:55.352 "recv_buf_size": 4096, 00:07:55.352 "send_buf_size": 4096, 
00:07:55.352 "enable_recv_pipe": true, 00:07:55.352 "enable_quickack": false, 00:07:55.352 "enable_placement_id": 0, 00:07:55.352 "enable_zerocopy_send_server": true, 00:07:55.352 "enable_zerocopy_send_client": false, 00:07:55.352 "zerocopy_threshold": 0, 00:07:55.352 "tls_version": 0, 00:07:55.352 "enable_ktls": false 00:07:55.352 } 00:07:55.352 }, 00:07:55.352 { 00:07:55.352 "method": "sock_impl_set_options", 00:07:55.352 "params": { 00:07:55.352 "impl_name": "posix", 00:07:55.352 "recv_buf_size": 2097152, 00:07:55.352 "send_buf_size": 2097152, 00:07:55.352 "enable_recv_pipe": true, 00:07:55.352 "enable_quickack": false, 00:07:55.352 "enable_placement_id": 0, 00:07:55.352 "enable_zerocopy_send_server": true, 00:07:55.352 "enable_zerocopy_send_client": false, 00:07:55.352 "zerocopy_threshold": 0, 00:07:55.352 "tls_version": 0, 00:07:55.352 "enable_ktls": false 00:07:55.352 } 00:07:55.352 } 00:07:55.352 ] 00:07:55.352 }, 00:07:55.352 { 00:07:55.352 "subsystem": "vmd", 00:07:55.352 "config": [] 00:07:55.352 }, 00:07:55.352 { 00:07:55.352 "subsystem": "accel", 00:07:55.352 "config": [ 00:07:55.352 { 00:07:55.352 "method": "accel_set_options", 00:07:55.352 "params": { 00:07:55.352 "small_cache_size": 128, 00:07:55.352 "large_cache_size": 16, 00:07:55.352 "task_count": 2048, 00:07:55.352 "sequence_count": 2048, 00:07:55.352 "buf_count": 2048 00:07:55.352 } 00:07:55.352 } 00:07:55.352 ] 00:07:55.352 }, 00:07:55.352 { 00:07:55.352 "subsystem": "bdev", 00:07:55.352 "config": [ 00:07:55.352 { 00:07:55.352 "method": "bdev_set_options", 00:07:55.352 "params": { 00:07:55.352 "bdev_io_pool_size": 65535, 00:07:55.352 "bdev_io_cache_size": 256, 00:07:55.352 "bdev_auto_examine": true, 00:07:55.352 "iobuf_small_cache_size": 128, 00:07:55.352 "iobuf_large_cache_size": 16 00:07:55.352 } 00:07:55.352 }, 00:07:55.352 { 00:07:55.352 "method": "bdev_raid_set_options", 00:07:55.352 "params": { 00:07:55.352 "process_window_size_kb": 1024 00:07:55.352 } 00:07:55.352 }, 00:07:55.352 { 00:07:55.352 "method": "bdev_iscsi_set_options", 00:07:55.352 "params": { 00:07:55.352 "timeout_sec": 30 00:07:55.352 } 00:07:55.352 }, 00:07:55.353 { 00:07:55.353 "method": "bdev_nvme_set_options", 00:07:55.353 "params": { 00:07:55.353 "action_on_timeout": "none", 00:07:55.353 "timeout_us": 0, 00:07:55.353 "timeout_admin_us": 0, 00:07:55.353 "keep_alive_timeout_ms": 10000, 00:07:55.353 "arbitration_burst": 0, 00:07:55.353 "low_priority_weight": 0, 00:07:55.353 "medium_priority_weight": 0, 00:07:55.353 "high_priority_weight": 0, 00:07:55.353 "nvme_adminq_poll_period_us": 10000, 00:07:55.353 "nvme_ioq_poll_period_us": 0, 00:07:55.353 "io_queue_requests": 0, 00:07:55.353 "delay_cmd_submit": true, 00:07:55.353 "transport_retry_count": 4, 00:07:55.353 "bdev_retry_count": 3, 00:07:55.353 "transport_ack_timeout": 0, 00:07:55.353 "ctrlr_loss_timeout_sec": 0, 00:07:55.353 "reconnect_delay_sec": 0, 00:07:55.353 "fast_io_fail_timeout_sec": 0, 00:07:55.353 "disable_auto_failback": false, 00:07:55.353 "generate_uuids": false, 00:07:55.353 "transport_tos": 0, 00:07:55.353 "nvme_error_stat": false, 00:07:55.353 "rdma_srq_size": 0, 00:07:55.353 "io_path_stat": false, 00:07:55.353 "allow_accel_sequence": false, 00:07:55.353 "rdma_max_cq_size": 0, 00:07:55.353 "rdma_cm_event_timeout_ms": 0, 00:07:55.353 "dhchap_digests": [ 00:07:55.353 "sha256", 00:07:55.353 "sha384", 00:07:55.353 "sha512" 00:07:55.353 ], 00:07:55.353 "dhchap_dhgroups": [ 00:07:55.353 "null", 00:07:55.353 "ffdhe2048", 00:07:55.353 "ffdhe3072", 00:07:55.353 "ffdhe4096", 00:07:55.353 
"ffdhe6144", 00:07:55.353 "ffdhe8192" 00:07:55.353 ] 00:07:55.353 } 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "method": "bdev_nvme_set_hotplug", 00:07:55.353 "params": { 00:07:55.353 "period_us": 100000, 00:07:55.353 "enable": false 00:07:55.353 } 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "method": "bdev_wait_for_examine" 00:07:55.353 } 00:07:55.353 ] 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "subsystem": "scsi", 00:07:55.353 "config": null 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "subsystem": "scheduler", 00:07:55.353 "config": [ 00:07:55.353 { 00:07:55.353 "method": "framework_set_scheduler", 00:07:55.353 "params": { 00:07:55.353 "name": "static" 00:07:55.353 } 00:07:55.353 } 00:07:55.353 ] 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "subsystem": "vhost_scsi", 00:07:55.353 "config": [] 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "subsystem": "vhost_blk", 00:07:55.353 "config": [] 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "subsystem": "ublk", 00:07:55.353 "config": [] 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "subsystem": "nbd", 00:07:55.353 "config": [] 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "subsystem": "nvmf", 00:07:55.353 "config": [ 00:07:55.353 { 00:07:55.353 "method": "nvmf_set_config", 00:07:55.353 "params": { 00:07:55.353 "discovery_filter": "match_any", 00:07:55.353 "admin_cmd_passthru": { 00:07:55.353 "identify_ctrlr": false 00:07:55.353 } 00:07:55.353 } 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "method": "nvmf_set_max_subsystems", 00:07:55.353 "params": { 00:07:55.353 "max_subsystems": 1024 00:07:55.353 } 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "method": "nvmf_set_crdt", 00:07:55.353 "params": { 00:07:55.353 "crdt1": 0, 00:07:55.353 "crdt2": 0, 00:07:55.353 "crdt3": 0 00:07:55.353 } 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "method": "nvmf_create_transport", 00:07:55.353 "params": { 00:07:55.353 "trtype": "TCP", 00:07:55.353 "max_queue_depth": 128, 00:07:55.353 "max_io_qpairs_per_ctrlr": 127, 00:07:55.353 "in_capsule_data_size": 4096, 00:07:55.353 "max_io_size": 131072, 00:07:55.353 "io_unit_size": 131072, 00:07:55.353 "max_aq_depth": 128, 00:07:55.353 "num_shared_buffers": 511, 00:07:55.353 "buf_cache_size": 4294967295, 00:07:55.353 "dif_insert_or_strip": false, 00:07:55.353 "zcopy": false, 00:07:55.353 "c2h_success": true, 00:07:55.353 "sock_priority": 0, 00:07:55.353 "abort_timeout_sec": 1, 00:07:55.353 "ack_timeout": 0, 00:07:55.353 "data_wr_pool_size": 0 00:07:55.353 } 00:07:55.353 } 00:07:55.353 ] 00:07:55.353 }, 00:07:55.353 { 00:07:55.353 "subsystem": "iscsi", 00:07:55.353 "config": [ 00:07:55.353 { 00:07:55.353 "method": "iscsi_set_options", 00:07:55.353 "params": { 00:07:55.353 "node_base": "iqn.2016-06.io.spdk", 00:07:55.353 "max_sessions": 128, 00:07:55.353 "max_connections_per_session": 2, 00:07:55.353 "max_queue_depth": 64, 00:07:55.353 "default_time2wait": 2, 00:07:55.353 "default_time2retain": 20, 00:07:55.353 "first_burst_length": 8192, 00:07:55.353 "immediate_data": true, 00:07:55.353 "allow_duplicated_isid": false, 00:07:55.353 "error_recovery_level": 0, 00:07:55.353 "nop_timeout": 60, 00:07:55.353 "nop_in_interval": 30, 00:07:55.353 "disable_chap": false, 00:07:55.353 "require_chap": false, 00:07:55.353 "mutual_chap": false, 00:07:55.353 "chap_group": 0, 00:07:55.353 "max_large_datain_per_connection": 64, 00:07:55.353 "max_r2t_per_connection": 4, 00:07:55.353 "pdu_pool_size": 36864, 00:07:55.353 "immediate_data_pool_size": 16384, 00:07:55.353 "data_out_pool_size": 2048 00:07:55.353 } 00:07:55.353 } 00:07:55.353 ] 00:07:55.353 } 
00:07:55.353 ] 00:07:55.353 } 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 2453530 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2453530 ']' 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2453530 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2453530 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2453530' 00:07:55.353 killing process with pid 2453530 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2453530 00:07:55.353 15:44:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2453530 00:07:55.613 15:44:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=2453821 00:07:55.613 15:44:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:07:55.613 15:44:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:08:00.897 15:44:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 2453821 00:08:00.897 15:44:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 2453821 ']' 00:08:00.897 15:44:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 2453821 00:08:00.897 15:44:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:08:00.897 15:44:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:00.897 15:44:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2453821 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2453821' 00:08:00.897 killing process with pid 2453821 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 2453821 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 2453821 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:08:00.897 00:08:00.897 real 0m6.589s 00:08:00.897 user 0m6.456s 00:08:00.897 sys 0m0.586s 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:00.897 
15:44:21 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:08:00.897 ************************************ 00:08:00.897 END TEST skip_rpc_with_json 00:08:00.897 ************************************ 00:08:00.897 15:44:21 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:00.897 15:44:21 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:08:00.897 15:44:21 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:00.897 15:44:21 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:00.897 15:44:21 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:00.897 ************************************ 00:08:00.897 START TEST skip_rpc_with_delay 00:08:00.897 ************************************ 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:00.897 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:08:00.898 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:08:01.158 [2024-07-12 15:44:21.360917] app.c: 836:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
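The failure logged just above is intentional: the skip_rpc_with_delay test launches spdk_tgt with both --no-rpc-server and --wait-for-rpc and expects the combination to be rejected. A minimal stand-alone sketch of that negative check, using only the binary path and flags visible in this log (the harness's own NOT/valid_exec_arg helpers are not reproduced here):

    # Expect failure: --wait-for-rpc is meaningless when no RPC server will be started.
    if /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
        echo "unexpected success: spdk_tgt accepted --no-rpc-server with --wait-for-rpc" >&2
        exit 1
    fi
    echo "spdk_tgt rejected the flag combination as expected"
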
00:08:01.158 [2024-07-12 15:44:21.361029] app.c: 715:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:08:01.158 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:08:01.158 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:01.158 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:08:01.158 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:01.158 00:08:01.158 real 0m0.086s 00:08:01.158 user 0m0.059s 00:08:01.158 sys 0m0.026s 00:08:01.158 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:01.158 15:44:21 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:08:01.158 ************************************ 00:08:01.158 END TEST skip_rpc_with_delay 00:08:01.158 ************************************ 00:08:01.158 15:44:21 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:01.158 15:44:21 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:08:01.158 15:44:21 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:08:01.158 15:44:21 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:08:01.158 15:44:21 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:01.158 15:44:21 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:01.158 15:44:21 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:01.158 ************************************ 00:08:01.158 START TEST exit_on_failed_rpc_init 00:08:01.158 ************************************ 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=2454791 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 2454791 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 2454791 ']' 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:01.158 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:01.158 15:44:21 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:08:01.158 [2024-07-12 15:44:21.517469] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:08:01.158 [2024-07-12 15:44:21.517512] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2454791 ] 00:08:01.158 [2024-07-12 15:44:21.602242] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.418 [2024-07-12 15:44:21.665014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:08:01.989 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:08:01.989 [2024-07-12 15:44:22.418515] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:08:01.989 [2024-07-12 15:44:22.418563] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2455003 ] 00:08:02.250 [2024-07-12 15:44:22.492144] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.250 [2024-07-12 15:44:22.562456] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:02.250 [2024-07-12 15:44:22.562520] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
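The rpc.c errors around this point come from the second spdk_tgt instance in exit_on_failed_rpc_init deliberately binding the same default RPC socket, /var/tmp/spdk.sock, as the first instance, which is exactly the failure the test wants to observe. For contrast, a hedged illustration of running two targets side by side without that clash, reusing the -r flag and the rpc.py -s option that appear elsewhere in this log (paths are relative to an SPDK checkout and /var/tmp/spdk2.sock is an arbitrary choice):

    # First target keeps the default RPC socket, /var/tmp/spdk.sock.
    ./build/bin/spdk_tgt -m 0x1 &
    # Second target listens on its own socket instead of colliding with the first.
    ./build/bin/spdk_tgt -m 0x2 -r /var/tmp/spdk2.sock &
    # Address the second instance explicitly through its socket.
    ./scripts/rpc.py -s /var/tmp/spdk2.sock spdk_get_version
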
00:08:02.250 [2024-07-12 15:44:22.562531] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:08:02.250 [2024-07-12 15:44:22.562538] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 2454791 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 2454791 ']' 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 2454791 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2454791 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2454791' 00:08:02.250 killing process with pid 2454791 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 2454791 00:08:02.250 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 2454791 00:08:02.511 00:08:02.511 real 0m1.425s 00:08:02.511 user 0m1.713s 00:08:02.511 sys 0m0.387s 00:08:02.511 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.511 15:44:22 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:08:02.511 ************************************ 00:08:02.511 END TEST exit_on_failed_rpc_init 00:08:02.511 ************************************ 00:08:02.511 15:44:22 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:08:02.511 15:44:22 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:08:02.511 00:08:02.511 real 0m13.785s 00:08:02.511 user 0m13.441s 00:08:02.511 sys 0m1.517s 00:08:02.511 15:44:22 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.511 15:44:22 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:02.511 ************************************ 00:08:02.511 END TEST skip_rpc 00:08:02.511 ************************************ 00:08:02.773 15:44:22 -- common/autotest_common.sh@1142 -- # return 0 00:08:02.773 15:44:22 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:08:02.773 15:44:22 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:02.773 15:44:22 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.773 15:44:22 -- common/autotest_common.sh@10 -- # set +x 00:08:02.773 ************************************ 00:08:02.773 START TEST rpc_client 00:08:02.773 ************************************ 00:08:02.773 15:44:22 rpc_client -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:08:02.773 * Looking for test storage... 00:08:02.774 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:08:02.774 15:44:23 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:08:02.774 OK 00:08:02.774 15:44:23 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:08:02.774 00:08:02.774 real 0m0.124s 00:08:02.774 user 0m0.049s 00:08:02.774 sys 0m0.084s 00:08:02.774 15:44:23 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:02.774 15:44:23 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:08:02.774 ************************************ 00:08:02.774 END TEST rpc_client 00:08:02.774 ************************************ 00:08:02.774 15:44:23 -- common/autotest_common.sh@1142 -- # return 0 00:08:02.774 15:44:23 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:08:02.774 15:44:23 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:02.774 15:44:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:02.774 15:44:23 -- common/autotest_common.sh@10 -- # set +x 00:08:02.774 ************************************ 00:08:02.774 START TEST json_config 00:08:02.774 ************************************ 00:08:02.774 15:44:23 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@7 -- # uname -s 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:08:03.036 15:44:23 json_config -- 
nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:08:03.036 15:44:23 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:03.036 15:44:23 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:03.036 15:44:23 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:03.036 15:44:23 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:03.036 15:44:23 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:03.036 15:44:23 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:03.036 15:44:23 json_config -- paths/export.sh@5 -- # export PATH 00:08:03.036 15:44:23 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@47 -- # : 0 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:08:03.036 15:44:23 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:08:03.036 
15:44:23 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:08:03.036 15:44:23 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:08:03.037 15:44:23 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:08:03.037 INFO: JSON configuration test init 00:08:03.037 15:44:23 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:08:03.037 15:44:23 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.037 15:44:23 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.037 15:44:23 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:08:03.037 15:44:23 json_config -- json_config/common.sh@9 -- # local app=target 00:08:03.037 15:44:23 json_config -- json_config/common.sh@10 -- # shift 00:08:03.037 15:44:23 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:03.037 15:44:23 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:03.037 15:44:23 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:08:03.037 15:44:23 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:03.037 15:44:23 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:03.037 15:44:23 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2455215 00:08:03.037 15:44:23 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:03.037 Waiting for target to run... 
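At this point the json_config test has launched spdk_tgt with -r /var/tmp/spdk_tgt.sock --wait-for-rpc and is about to waitforlisten on that socket. Roughly, that step just polls the RPC socket until it answers; a minimal sketch assuming the socket path from this run and the stock rpc.py client (the retry count and sleep interval below are arbitrary):

    sock=/var/tmp/spdk_tgt.sock
    # Poll until the target's RPC server responds; spdk_get_version takes no arguments.
    for _ in $(seq 1 100); do
        if ./scripts/rpc.py -s "$sock" spdk_get_version >/dev/null 2>&1; then
            echo "target is listening on $sock"
            break
        fi
        sleep 0.5
    done
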
00:08:03.037 15:44:23 json_config -- json_config/common.sh@25 -- # waitforlisten 2455215 /var/tmp/spdk_tgt.sock 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@829 -- # '[' -z 2455215 ']' 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:03.037 15:44:23 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:08:03.037 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:03.037 15:44:23 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.037 [2024-07-12 15:44:23.379520] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:08:03.037 [2024-07-12 15:44:23.379569] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2455215 ] 00:08:03.298 [2024-07-12 15:44:23.732297] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.558 [2024-07-12 15:44:23.783390] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.819 15:44:24 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:03.819 15:44:24 json_config -- common/autotest_common.sh@862 -- # return 0 00:08:03.819 15:44:24 json_config -- json_config/common.sh@26 -- # echo '' 00:08:03.819 00:08:03.819 15:44:24 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:08:03.819 15:44:24 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:08:03.819 15:44:24 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:03.819 15:44:24 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.819 15:44:24 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:08:03.819 15:44:24 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:08:03.819 15:44:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:08:04.079 15:44:24 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:08:04.079 15:44:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:08:04.339 [2024-07-12 15:44:24.581681] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:08:04.339 15:44:24 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:08:04.339 15:44:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:08:04.339 [2024-07-12 15:44:24.766130] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:08:04.339 15:44:24 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:08:04.339 15:44:24 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:04.339 15:44:24 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:04.599 15:44:24 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:08:04.599 15:44:24 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:08:04.599 15:44:24 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:08:04.599 [2024-07-12 15:44:25.018533] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:08:09.983 15:44:30 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:09.983 15:44:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:08:09.983 15:44:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@48 -- # local get_types 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:08:09.983 15:44:30 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:09.983 15:44:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@55 -- # return 0 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:08:09.983 15:44:30 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:09.983 15:44:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@111 -- # 
get_notifications 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:08:09.983 15:44:30 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:08:09.983 15:44:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:08:10.243 15:44:30 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:08:10.243 15:44:30 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:10.243 15:44:30 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:10.243 15:44:30 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:08:10.243 15:44:30 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:08:10.243 15:44:30 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:08:10.243 15:44:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:08:10.243 Nvme0n1p0 Nvme0n1p1 00:08:10.243 15:44:30 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:08:10.243 15:44:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:08:10.503 [2024-07-12 15:44:30.842368] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:10.503 [2024-07-12 15:44:30.842409] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:10.503 00:08:10.503 15:44:30 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:08:10.503 15:44:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:08:10.762 Malloc3 00:08:10.762 15:44:31 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:08:10.762 15:44:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:08:11.023 [2024-07-12 15:44:31.219382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:11.023 [2024-07-12 15:44:31.219418] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:11.023 [2024-07-12 15:44:31.219432] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15d7d50 00:08:11.023 [2024-07-12 15:44:31.219438] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:11.023 [2024-07-12 15:44:31.220658] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:11.023 [2024-07-12 15:44:31.220679] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:08:11.023 PTBdevFromMalloc3 00:08:11.023 15:44:31 json_config 
-- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:08:11.023 15:44:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:08:11.023 Null0 00:08:11.023 15:44:31 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:08:11.023 15:44:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:08:11.282 Malloc0 00:08:11.282 15:44:31 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:08:11.282 15:44:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:08:11.541 Malloc1 00:08:11.541 15:44:31 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:08:11.541 15:44:31 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:08:11.541 102400+0 records in 00:08:11.541 102400+0 records out 00:08:11.541 104857600 bytes (105 MB, 100 MiB) copied, 0.120116 s, 873 MB/s 00:08:11.541 15:44:31 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:08:11.541 15:44:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:08:11.802 aio_disk 00:08:11.802 15:44:32 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:08:11.802 15:44:32 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:08:11.802 15:44:32 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:08:16.002 45142155-a53b-4bcd-a5e0-6fce6c158c63 00:08:16.002 15:44:36 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:08:16.002 15:44:36 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:08:16.002 15:44:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:08:16.002 15:44:36 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:08:16.002 15:44:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:08:16.262 15:44:36 json_config -- json_config/json_config.sh@154 -- # tgt_rpc 
bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:08:16.262 15:44:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:08:16.522 15:44:36 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:08:16.522 15:44:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:08:16.782 15:44:36 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:08:16.782 15:44:36 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:08:16.782 15:44:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:08:16.782 MallocForCryptoBdev 00:08:16.782 15:44:37 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:08:16.782 15:44:37 json_config -- json_config/json_config.sh@159 -- # wc -l 00:08:16.782 15:44:37 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:08:16.782 15:44:37 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:08:16.782 15:44:37 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:08:16.782 15:44:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:08:17.043 [2024-07-12 15:44:37.380073] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:08:17.043 CryptoMallocBdev 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:707a9cd9-fd1f-4512-82b2-ca1a930b5899 bdev_register:66f03818-83a2-43d0-b31e-6c967cbe1548 bdev_register:f2fb5447-3c3c-4945-971d-6337a6046d0b bdev_register:b9d20a84-0847-4724-ac8f-d225635962ea bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 
bdev_register:Malloc1 bdev_register:aio_disk bdev_register:707a9cd9-fd1f-4512-82b2-ca1a930b5899 bdev_register:66f03818-83a2-43d0-b31e-6c967cbe1548 bdev_register:f2fb5447-3c3c-4945-971d-6337a6046d0b bdev_register:b9d20a84-0847-4724-ac8f-d225635962ea bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@71 -- # sort 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@72 -- # sort 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:08:17.043 15:44:37 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:08:17.043 15:44:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 
json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:707a9cd9-fd1f-4512-82b2-ca1a930b5899 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:66f03818-83a2-43d0-b31e-6c967cbe1548 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:f2fb5447-3c3c-4945-971d-6337a6046d0b 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:b9d20a84-0847-4724-ac8f-d225635962ea 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:17.304 15:44:37 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:66f03818-83a2-43d0-b31e-6c967cbe1548 bdev_register:707a9cd9-fd1f-4512-82b2-ca1a930b5899 bdev_register:aio_disk bdev_register:b9d20a84-0847-4724-ac8f-d225635962ea bdev_register:CryptoMallocBdev bdev_register:f2fb5447-3c3c-4945-971d-6337a6046d0b bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev 
bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\6\6\f\0\3\8\1\8\-\8\3\a\2\-\4\3\d\0\-\b\3\1\e\-\6\c\9\6\7\c\b\e\1\5\4\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\0\7\a\9\c\d\9\-\f\d\1\f\-\4\5\1\2\-\8\2\b\2\-\c\a\1\a\9\3\0\b\5\8\9\9\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\b\9\d\2\0\a\8\4\-\0\8\4\7\-\4\7\2\4\-\a\c\8\f\-\d\2\2\5\6\3\5\9\6\2\e\a\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\f\2\f\b\5\4\4\7\-\3\c\3\c\-\4\9\4\5\-\9\7\1\d\-\6\3\3\7\a\6\0\4\6\d\0\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@86 -- # cat 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:66f03818-83a2-43d0-b31e-6c967cbe1548 bdev_register:707a9cd9-fd1f-4512-82b2-ca1a930b5899 bdev_register:aio_disk bdev_register:b9d20a84-0847-4724-ac8f-d225635962ea bdev_register:CryptoMallocBdev bdev_register:f2fb5447-3c3c-4945-971d-6337a6046d0b bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:08:17.305 Expected events matched: 00:08:17.305 bdev_register:66f03818-83a2-43d0-b31e-6c967cbe1548 00:08:17.305 bdev_register:707a9cd9-fd1f-4512-82b2-ca1a930b5899 00:08:17.305 bdev_register:aio_disk 00:08:17.305 bdev_register:b9d20a84-0847-4724-ac8f-d225635962ea 00:08:17.305 bdev_register:CryptoMallocBdev 00:08:17.305 bdev_register:f2fb5447-3c3c-4945-971d-6337a6046d0b 00:08:17.305 bdev_register:Malloc0 00:08:17.305 bdev_register:Malloc0p0 00:08:17.305 bdev_register:Malloc0p1 00:08:17.305 bdev_register:Malloc0p2 00:08:17.305 bdev_register:Malloc1 00:08:17.305 bdev_register:Malloc3 00:08:17.305 bdev_register:MallocForCryptoBdev 00:08:17.305 bdev_register:Null0 00:08:17.305 bdev_register:Nvme0n1 00:08:17.305 bdev_register:Nvme0n1p0 00:08:17.305 bdev_register:Nvme0n1p1 00:08:17.305 bdev_register:PTBdevFromMalloc3 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:08:17.305 15:44:37 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:17.305 15:44:37 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:08:17.305 15:44:37 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:17.305 15:44:37 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:08:17.305 15:44:37 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:08:17.305 15:44:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:08:17.565 MallocBdevForConfigChangeCheck 00:08:17.565 15:44:37 json_config -- json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:08:17.565 15:44:37 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:17.565 15:44:37 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:17.565 15:44:37 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:08:17.565 15:44:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:17.827 15:44:38 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:08:17.827 INFO: shutting down applications... 00:08:17.827 15:44:38 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:08:17.827 15:44:38 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:08:17.827 15:44:38 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:08:17.827 15:44:38 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:08:18.087 [2024-07-12 15:44:38.415075] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:08:20.627 Calling clear_iscsi_subsystem 00:08:20.627 Calling clear_nvmf_subsystem 00:08:20.627 Calling clear_nbd_subsystem 00:08:20.627 Calling clear_ublk_subsystem 00:08:20.627 Calling clear_vhost_blk_subsystem 00:08:20.627 Calling clear_vhost_scsi_subsystem 00:08:20.627 Calling clear_bdev_subsystem 00:08:20.627 15:44:40 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:08:20.627 15:44:40 json_config -- json_config/json_config.sh@343 -- # count=100 00:08:20.627 15:44:40 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:08:20.627 15:44:40 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:20.627 15:44:40 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:08:20.627 15:44:40 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:08:20.886 15:44:41 json_config -- json_config/json_config.sh@345 -- # break 00:08:20.886 15:44:41 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:08:20.886 15:44:41 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:08:20.886 15:44:41 json_config -- json_config/common.sh@31 -- # local app=target 00:08:20.886 15:44:41 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:08:20.886 15:44:41 json_config -- json_config/common.sh@35 -- # [[ -n 
2455215 ]] 00:08:20.886 15:44:41 json_config -- json_config/common.sh@38 -- # kill -SIGINT 2455215 00:08:20.886 15:44:41 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:08:20.886 15:44:41 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:20.886 15:44:41 json_config -- json_config/common.sh@41 -- # kill -0 2455215 00:08:20.886 15:44:41 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:08:21.458 15:44:41 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:08:21.458 15:44:41 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:21.458 15:44:41 json_config -- json_config/common.sh@41 -- # kill -0 2455215 00:08:21.458 15:44:41 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:08:21.458 15:44:41 json_config -- json_config/common.sh@43 -- # break 00:08:21.458 15:44:41 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:08:21.458 15:44:41 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:08:21.458 SPDK target shutdown done 00:08:21.458 15:44:41 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:08:21.458 INFO: relaunching applications... 00:08:21.458 15:44:41 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:21.458 15:44:41 json_config -- json_config/common.sh@9 -- # local app=target 00:08:21.458 15:44:41 json_config -- json_config/common.sh@10 -- # shift 00:08:21.458 15:44:41 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:21.458 15:44:41 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:21.458 15:44:41 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:08:21.458 15:44:41 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:21.458 15:44:41 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:21.458 15:44:41 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2458452 00:08:21.458 15:44:41 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:21.458 Waiting for target to run... 00:08:21.458 15:44:41 json_config -- json_config/common.sh@25 -- # waitforlisten 2458452 /var/tmp/spdk_tgt.sock 00:08:21.458 15:44:41 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:21.458 15:44:41 json_config -- common/autotest_common.sh@829 -- # '[' -z 2458452 ']' 00:08:21.458 15:44:41 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:21.458 15:44:41 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:21.458 15:44:41 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:08:21.458 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:08:21.458 15:44:41 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:21.458 15:44:41 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:21.458 [2024-07-12 15:44:41.890815] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
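A condensed sketch of the notification check exercised earlier in this run: the test collects bdev_register events from the target with notify_get_notifications, formats them with jq, sorts both the recorded and the expected lists, and fails on any mismatch. The rpc.py invocation and jq filter below follow the log, but the helper names and the omitted event id are simplifications, not the test's literal code.

# Condensed sketch of the notification check above (simplified; the event id
# field printed by the real test is dropped here).
get_notifications() {
    ./scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 |
        jq -r '.[] | "\(.type):\(.ctx)"'      # e.g. "bdev_register:Nvme0n1"
}

expected=(bdev_register:Nvme0n1 bdev_register:Malloc0 bdev_register:aio_disk)   # ...and so on
readarray -t want < <(printf '%s\n' "${expected[@]}" | sort)
readarray -t got  < <(get_notifications | sort)
if [[ "${got[*]}" != "${want[*]}" ]]; then
    echo "ERROR: recorded notifications do not match expected events" >&2
    exit 1
fi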
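The shutdown-and-relaunch step shown above reduces to roughly the loop below; the spdk_tgt options and the saved spdk_tgt_config.json come straight from the log, while the surrounding variable handling is simplified (the real logic lives in test/json_config/common.sh).

# Roughly the shutdown/relaunch sequence above (simplified; paths shortened).
kill -SIGINT "$tgt_pid"                        # ask the target to exit cleanly
for (( i = 0; i < 30; i++ )); do
    kill -0 "$tgt_pid" 2>/dev/null || break    # process gone: shutdown done
    sleep 0.5
done
echo 'SPDK target shutdown done'

# Relaunch with the configuration saved earlier so the same bdevs are rebuilt.
./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
    --json ./spdk_tgt_config.json &
tgt_pid=$!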
00:08:21.458 [2024-07-12 15:44:41.890883] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2458452 ] 00:08:22.029 [2024-07-12 15:44:42.337865] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.029 [2024-07-12 15:44:42.392222] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.029 [2024-07-12 15:44:42.446055] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:08:22.029 [2024-07-12 15:44:42.454090] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:08:22.029 [2024-07-12 15:44:42.462108] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:08:22.289 [2024-07-12 15:44:42.542283] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:08:24.832 [2024-07-12 15:44:44.677998] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:24.832 [2024-07-12 15:44:44.678043] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:24.832 [2024-07-12 15:44:44.678051] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:24.832 [2024-07-12 15:44:44.686014] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:08:24.832 [2024-07-12 15:44:44.686032] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:08:24.832 [2024-07-12 15:44:44.694026] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:24.832 [2024-07-12 15:44:44.694042] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:24.832 [2024-07-12 15:44:44.702058] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:08:24.832 [2024-07-12 15:44:44.702075] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:08:24.832 [2024-07-12 15:44:44.702081] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:27.375 [2024-07-12 15:44:47.560103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:27.375 [2024-07-12 15:44:47.560140] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:27.375 [2024-07-12 15:44:47.560151] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xee1de0 00:08:27.375 [2024-07-12 15:44:47.560158] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:27.375 [2024-07-12 15:44:47.560388] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:27.375 [2024-07-12 15:44:47.560402] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:08:27.375 15:44:47 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:27.375 15:44:47 json_config -- common/autotest_common.sh@862 -- # return 0 00:08:27.375 15:44:47 json_config -- json_config/common.sh@26 -- # echo '' 00:08:27.375 00:08:27.375 15:44:47 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:08:27.375 15:44:47 json_config -- 
json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 00:08:27.375 INFO: Checking if target configuration is the same... 00:08:27.375 15:44:47 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:27.375 15:44:47 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:08:27.375 15:44:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:27.375 + '[' 2 -ne 2 ']' 00:08:27.375 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:08:27.375 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:08:27.375 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:27.375 +++ basename /dev/fd/62 00:08:27.375 ++ mktemp /tmp/62.XXX 00:08:27.375 + tmp_file_1=/tmp/62.4VO 00:08:27.375 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:27.375 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:08:27.375 + tmp_file_2=/tmp/spdk_tgt_config.json.w7I 00:08:27.375 + ret=0 00:08:27.375 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:27.635 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:27.896 + diff -u /tmp/62.4VO /tmp/spdk_tgt_config.json.w7I 00:08:27.896 + echo 'INFO: JSON config files are the same' 00:08:27.896 INFO: JSON config files are the same 00:08:27.896 + rm /tmp/62.4VO /tmp/spdk_tgt_config.json.w7I 00:08:27.896 + exit 0 00:08:27.896 15:44:48 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:08:27.896 15:44:48 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:08:27.896 INFO: changing configuration and checking if this can be detected... 00:08:27.896 15:44:48 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:08:27.896 15:44:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:08:27.896 15:44:48 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:08:27.896 15:44:48 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:27.896 15:44:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:27.896 + '[' 2 -ne 2 ']' 00:08:27.896 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:08:27.896 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
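The json_diff.sh run above checks that the relaunched target reproduces the saved configuration: both JSON documents are normalised with config_filter.py -method sort and compared with diff -u, so only real content differences count. A minimal sketch of that comparison, assuming config_filter.py filters stdin to stdout as its use in the pipelines above suggests, with the temp-file plumbing simplified:

# Minimal sketch of the configuration comparison above (temp-file plumbing of
# json_diff.sh simplified; config_filter.py assumed to filter stdin to stdout).
./scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config |
    ./test/json_config/config_filter.py -method sort > /tmp/live_sorted.json
./test/json_config/config_filter.py -method sort \
    < ./spdk_tgt_config.json > /tmp/saved_sorted.json

if diff -u /tmp/saved_sorted.json /tmp/live_sorted.json; then
    echo 'INFO: JSON config files are the same'
else
    echo 'INFO: configuration change detected.'
fi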
00:08:27.896 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:27.896 +++ basename /dev/fd/62 00:08:27.896 ++ mktemp /tmp/62.XXX 00:08:27.896 + tmp_file_1=/tmp/62.mK3 00:08:27.896 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:27.896 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:08:27.896 + tmp_file_2=/tmp/spdk_tgt_config.json.gS5 00:08:27.896 + ret=0 00:08:27.896 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:28.156 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:28.416 + diff -u /tmp/62.mK3 /tmp/spdk_tgt_config.json.gS5 00:08:28.417 + ret=1 00:08:28.417 + echo '=== Start of file: /tmp/62.mK3 ===' 00:08:28.417 + cat /tmp/62.mK3 00:08:28.417 + echo '=== End of file: /tmp/62.mK3 ===' 00:08:28.417 + echo '' 00:08:28.417 + echo '=== Start of file: /tmp/spdk_tgt_config.json.gS5 ===' 00:08:28.417 + cat /tmp/spdk_tgt_config.json.gS5 00:08:28.417 + echo '=== End of file: /tmp/spdk_tgt_config.json.gS5 ===' 00:08:28.417 + echo '' 00:08:28.417 + rm /tmp/62.mK3 /tmp/spdk_tgt_config.json.gS5 00:08:28.417 + exit 1 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:08:28.417 INFO: configuration change detected. 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:08:28.417 15:44:48 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:28.417 15:44:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@317 -- # [[ -n 2458452 ]] 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:08:28.417 15:44:48 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:28.417 15:44:48 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:08:28.417 15:44:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:08:28.417 15:44:48 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:08:28.417 15:44:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:08:28.677 15:44:48 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:08:28.677 15:44:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/snapshot0 00:08:28.938 15:44:49 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:08:28.938 15:44:49 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:08:28.938 15:44:49 json_config -- json_config/json_config.sh@193 -- # uname -s 00:08:28.938 15:44:49 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:08:28.938 15:44:49 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:08:28.938 15:44:49 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:08:28.938 15:44:49 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:28.938 15:44:49 json_config -- json_config/json_config.sh@323 -- # killprocess 2458452 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@948 -- # '[' -z 2458452 ']' 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@952 -- # kill -0 2458452 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@953 -- # uname 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2458452 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2458452' 00:08:28.938 killing process with pid 2458452 00:08:28.938 15:44:49 json_config -- common/autotest_common.sh@967 -- # kill 2458452 00:08:29.198 15:44:49 json_config -- common/autotest_common.sh@972 -- # wait 2458452 00:08:31.798 15:44:51 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:31.798 15:44:51 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:08:31.798 15:44:51 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:31.798 15:44:51 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:31.798 15:44:51 json_config -- json_config/json_config.sh@328 -- # return 0 00:08:31.798 15:44:51 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:08:31.798 INFO: Success 00:08:31.798 00:08:31.798 real 0m28.648s 00:08:31.798 user 0m32.238s 00:08:31.798 sys 0m2.855s 00:08:31.798 15:44:51 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.798 15:44:51 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:31.798 ************************************ 00:08:31.798 END TEST json_config 00:08:31.798 ************************************ 00:08:31.798 15:44:51 -- common/autotest_common.sh@1142 -- # return 0 00:08:31.798 15:44:51 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:08:31.798 15:44:51 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:31.798 15:44:51 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.798 15:44:51 -- common/autotest_common.sh@10 -- # set +x 00:08:31.798 ************************************ 00:08:31.798 START TEST json_config_extra_key 00:08:31.798 ************************************ 00:08:31.798 15:44:51 
json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:08:31.798 15:44:51 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:08:31.798 15:44:51 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:08:31.798 15:44:51 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:08:31.798 15:44:52 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:31.798 15:44:52 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:31.798 15:44:52 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:31.798 15:44:52 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:31.798 15:44:52 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:31.798 15:44:52 json_config_extra_key -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:31.798 15:44:52 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:08:31.798 15:44:52 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:08:31.798 15:44:52 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:08:31.798 INFO: launching applications... 
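The json_config_extra_key test configures the target entirely from a pre-written file, passing --json .../test/json_config/extra_key.json at launch (visible in the next lines) instead of issuing RPCs afterwards. The snippet below only illustrates the general shape such an SPDK JSON config takes; it is not the contents of extra_key.json, and the malloc parameters are invented.

# Illustrative only: the general shape of a config passed to spdk_tgt --json
# (not the actual extra_key.json; the bdev parameters here are made up).
cat > /tmp/example_config.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 } }
      ]
    }
  ]
}
EOF

./build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
    --json /tmp/example_config.json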
00:08:31.798 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=2460382 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:31.798 Waiting for target to run... 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 2460382 /var/tmp/spdk_tgt.sock 00:08:31.798 15:44:52 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 2460382 ']' 00:08:31.798 15:44:52 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:31.798 15:44:52 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:08:31.798 15:44:52 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:31.798 15:44:52 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:08:31.798 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:08:31.798 15:44:52 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:31.798 15:44:52 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:08:31.798 [2024-07-12 15:44:52.091118] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:08:31.798 [2024-07-12 15:44:52.091181] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2460382 ] 00:08:32.058 [2024-07-12 15:44:52.374335] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.058 [2024-07-12 15:44:52.423627] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.629 15:44:52 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:32.629 15:44:52 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:08:32.629 15:44:52 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:08:32.629 00:08:32.629 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:08:32.629 INFO: shutting down applications... 
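Before the shutdown that follows, the test waited for the freshly launched target to answer on its RPC socket ("Waiting for process to start up and listen on UNIX domain socket ..."). The loop below is only a simplified stand-in for the waitforlisten helper in autotest_common.sh, which does more bookkeeping; the 100-attempt budget mirrors the max_retries=100 visible in the log.

# Simplified stand-in for the waitforlisten idea (the real helper does more).
wait_for_rpc() {
    local pid=$1 sock=${2:-/var/tmp/spdk_tgt.sock} i
    for (( i = 0; i < 100; i++ )); do                      # max_retries=100, as in the log
        kill -0 "$pid" 2>/dev/null || return 1             # target died during startup
        ./scripts/rpc.py -t 1 -s "$sock" rpc_get_methods &>/dev/null && return 0
        sleep 0.1
    done
    return 1
}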
00:08:32.629 15:44:52 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:08:32.629 15:44:52 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:08:32.629 15:44:52 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:08:32.629 15:44:52 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 2460382 ]] 00:08:32.629 15:44:52 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 2460382 00:08:32.629 15:44:52 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:08:32.629 15:44:52 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:32.629 15:44:52 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2460382 00:08:32.629 15:44:52 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:08:33.198 15:44:53 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:08:33.198 15:44:53 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:33.198 15:44:53 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2460382 00:08:33.198 15:44:53 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:08:33.198 15:44:53 json_config_extra_key -- json_config/common.sh@43 -- # break 00:08:33.198 15:44:53 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:08:33.198 15:44:53 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:08:33.198 SPDK target shutdown done 00:08:33.198 15:44:53 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:08:33.198 Success 00:08:33.198 00:08:33.198 real 0m1.501s 00:08:33.198 user 0m1.101s 00:08:33.198 sys 0m0.384s 00:08:33.198 15:44:53 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:33.198 15:44:53 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:08:33.198 ************************************ 00:08:33.198 END TEST json_config_extra_key 00:08:33.198 ************************************ 00:08:33.198 15:44:53 -- common/autotest_common.sh@1142 -- # return 0 00:08:33.198 15:44:53 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:08:33.198 15:44:53 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:33.198 15:44:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:33.198 15:44:53 -- common/autotest_common.sh@10 -- # set +x 00:08:33.198 ************************************ 00:08:33.198 START TEST alias_rpc 00:08:33.198 ************************************ 00:08:33.198 15:44:53 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:08:33.198 * Looking for test storage... 
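Each test in this log is driven through the run_test wrapper (for example "run_test alias_rpc .../alias_rpc.sh" above), which prints the START TEST / END TEST banners and the real/user/sys timing seen throughout. A rough rendering of that wrapper, not the actual autotest_common.sh implementation:

# Rough rendering of the run_test wrapper behind the banners in this log
# (the real function in autotest_common.sh does more bookkeeping).
run_test() {
    local name=$1; shift
    echo '************************************'
    echo "START TEST $name"
    echo '************************************'
    time "$@"
    echo '************************************'
    echo "END TEST $name"
    echo '************************************'
}

run_test alias_rpc ./test/json_config/alias_rpc/alias_rpc.sh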
00:08:33.198 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:08:33.198 15:44:53 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:33.198 15:44:53 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2460738 00:08:33.198 15:44:53 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2460738 00:08:33.198 15:44:53 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:33.198 15:44:53 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 2460738 ']' 00:08:33.198 15:44:53 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:33.198 15:44:53 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:33.198 15:44:53 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:33.198 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:33.198 15:44:53 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:33.198 15:44:53 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:33.459 [2024-07-12 15:44:53.666227] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:08:33.459 [2024-07-12 15:44:53.666292] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2460738 ] 00:08:33.459 [2024-07-12 15:44:53.760825] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.459 [2024-07-12 15:44:53.834778] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:34.398 15:44:54 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:08:34.398 15:44:54 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2460738 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 2460738 ']' 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 2460738 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2460738 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2460738' 00:08:34.398 killing process with pid 2460738 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@967 -- # kill 2460738 00:08:34.398 15:44:54 alias_rpc -- common/autotest_common.sh@972 -- # wait 2460738 00:08:34.657 00:08:34.657 real 0m1.442s 00:08:34.657 user 0m1.623s 00:08:34.657 sys 0m0.392s 00:08:34.657 15:44:54 alias_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:34.657 15:44:54 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:34.657 ************************************ 00:08:34.657 END TEST alias_rpc 
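The core step of the alias_rpc test above is a single "rpc.py load_config -i" call against the freshly started target. The sketch below shows one way such a round trip can look; the stdin redirection and file name are illustrative assumptions, and -i is read here as "also accept deprecated alias method names", which is the behaviour this test exists to cover.

# Illustrative load_config round trip (file name and redirection are
# assumptions; -i taken to mean "accept deprecated alias method names").
./scripts/rpc.py save_config > /tmp/alias_config.json
./scripts/rpc.py load_config -i < /tmp/alias_config.json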
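The killprocess sequence above (kill -0 check, ps --no-headers -o comm=, the sudo guard, then kill and wait) condenses to roughly the following. This is a trimmed rendering of the helper's behaviour as it appears in the log, not the full autotest_common.sh function.

# Trimmed rendering of the killprocess behaviour visible above.
killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1
    kill -0 "$pid" || return 1                      # must still be running
    if [ "$(uname)" = Linux ]; then
        local name
        name=$(ps --no-headers -o comm= "$pid")     # reactor_0 for spdk_tgt
        [ "$name" = sudo ] && return 1              # never kill a wrapping sudo
    fi
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" || true
}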
00:08:34.657 ************************************ 00:08:34.657 15:44:54 -- common/autotest_common.sh@1142 -- # return 0 00:08:34.657 15:44:54 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:08:34.657 15:44:54 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:08:34.657 15:44:54 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:34.657 15:44:54 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:34.657 15:44:54 -- common/autotest_common.sh@10 -- # set +x 00:08:34.657 ************************************ 00:08:34.657 START TEST spdkcli_tcp 00:08:34.657 ************************************ 00:08:34.657 15:44:55 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:08:34.917 * Looking for test storage... 00:08:34.917 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:08:34.917 15:44:55 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:08:34.917 15:44:55 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:08:34.918 15:44:55 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:08:34.918 15:44:55 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:08:34.918 15:44:55 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:08:34.918 15:44:55 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:08:34.918 15:44:55 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:08:34.918 15:44:55 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:34.918 15:44:55 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:34.918 15:44:55 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2461089 00:08:34.918 15:44:55 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 2461089 00:08:34.918 15:44:55 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:08:34.918 15:44:55 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 2461089 ']' 00:08:34.918 15:44:55 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:34.918 15:44:55 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:34.918 15:44:55 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:34.918 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:34.918 15:44:55 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:34.918 15:44:55 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:34.918 [2024-07-12 15:44:55.182634] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
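The spdkcli_tcp test above prepares IP_ADDRESS=127.0.0.1 and PORT=9998, and the next lines show the bridge it builds: a socat process forwards that TCP port to the target's UNIX-domain RPC socket so rpc.py can talk to it over TCP. Stripped of the test's error handling, the bridge amounts to the following, with -r and -t read here as connection retries and per-request timeout.

# The TCP bridge used by the spdkcli_tcp test (error handling stripped).
socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
socat_pid=$!

# Same invocation as in the log: retry the connection up to 100 times with a
# 2-second timeout, then list the available RPC methods over TCP.
./scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods

kill "$socat_pid" 2>/dev/null || true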
00:08:34.918 [2024-07-12 15:44:55.182686] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2461089 ] 00:08:34.918 [2024-07-12 15:44:55.270542] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:34.918 [2024-07-12 15:44:55.334138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:34.918 [2024-07-12 15:44:55.334143] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.856 15:44:56 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:35.856 15:44:56 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:08:35.856 15:44:56 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=2461118 00:08:35.856 15:44:56 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:08:35.856 15:44:56 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:08:35.856 [ 00:08:35.857 "bdev_malloc_delete", 00:08:35.857 "bdev_malloc_create", 00:08:35.857 "bdev_null_resize", 00:08:35.857 "bdev_null_delete", 00:08:35.857 "bdev_null_create", 00:08:35.857 "bdev_nvme_cuse_unregister", 00:08:35.857 "bdev_nvme_cuse_register", 00:08:35.857 "bdev_opal_new_user", 00:08:35.857 "bdev_opal_set_lock_state", 00:08:35.857 "bdev_opal_delete", 00:08:35.857 "bdev_opal_get_info", 00:08:35.857 "bdev_opal_create", 00:08:35.857 "bdev_nvme_opal_revert", 00:08:35.857 "bdev_nvme_opal_init", 00:08:35.857 "bdev_nvme_send_cmd", 00:08:35.857 "bdev_nvme_get_path_iostat", 00:08:35.857 "bdev_nvme_get_mdns_discovery_info", 00:08:35.857 "bdev_nvme_stop_mdns_discovery", 00:08:35.857 "bdev_nvme_start_mdns_discovery", 00:08:35.857 "bdev_nvme_set_multipath_policy", 00:08:35.857 "bdev_nvme_set_preferred_path", 00:08:35.857 "bdev_nvme_get_io_paths", 00:08:35.857 "bdev_nvme_remove_error_injection", 00:08:35.857 "bdev_nvme_add_error_injection", 00:08:35.857 "bdev_nvme_get_discovery_info", 00:08:35.857 "bdev_nvme_stop_discovery", 00:08:35.857 "bdev_nvme_start_discovery", 00:08:35.857 "bdev_nvme_get_controller_health_info", 00:08:35.857 "bdev_nvme_disable_controller", 00:08:35.857 "bdev_nvme_enable_controller", 00:08:35.857 "bdev_nvme_reset_controller", 00:08:35.857 "bdev_nvme_get_transport_statistics", 00:08:35.857 "bdev_nvme_apply_firmware", 00:08:35.857 "bdev_nvme_detach_controller", 00:08:35.857 "bdev_nvme_get_controllers", 00:08:35.857 "bdev_nvme_attach_controller", 00:08:35.857 "bdev_nvme_set_hotplug", 00:08:35.857 "bdev_nvme_set_options", 00:08:35.857 "bdev_passthru_delete", 00:08:35.857 "bdev_passthru_create", 00:08:35.857 "bdev_lvol_set_parent_bdev", 00:08:35.857 "bdev_lvol_set_parent", 00:08:35.857 "bdev_lvol_check_shallow_copy", 00:08:35.857 "bdev_lvol_start_shallow_copy", 00:08:35.857 "bdev_lvol_grow_lvstore", 00:08:35.857 "bdev_lvol_get_lvols", 00:08:35.857 "bdev_lvol_get_lvstores", 00:08:35.857 "bdev_lvol_delete", 00:08:35.857 "bdev_lvol_set_read_only", 00:08:35.857 "bdev_lvol_resize", 00:08:35.857 "bdev_lvol_decouple_parent", 00:08:35.857 "bdev_lvol_inflate", 00:08:35.857 "bdev_lvol_rename", 00:08:35.857 "bdev_lvol_clone_bdev", 00:08:35.857 "bdev_lvol_clone", 00:08:35.857 "bdev_lvol_snapshot", 00:08:35.857 "bdev_lvol_create", 00:08:35.857 "bdev_lvol_delete_lvstore", 00:08:35.857 "bdev_lvol_rename_lvstore", 00:08:35.857 "bdev_lvol_create_lvstore", 
00:08:35.857 "bdev_raid_set_options", 00:08:35.857 "bdev_raid_remove_base_bdev", 00:08:35.857 "bdev_raid_add_base_bdev", 00:08:35.857 "bdev_raid_delete", 00:08:35.857 "bdev_raid_create", 00:08:35.857 "bdev_raid_get_bdevs", 00:08:35.857 "bdev_error_inject_error", 00:08:35.857 "bdev_error_delete", 00:08:35.857 "bdev_error_create", 00:08:35.857 "bdev_split_delete", 00:08:35.857 "bdev_split_create", 00:08:35.857 "bdev_delay_delete", 00:08:35.857 "bdev_delay_create", 00:08:35.857 "bdev_delay_update_latency", 00:08:35.857 "bdev_zone_block_delete", 00:08:35.857 "bdev_zone_block_create", 00:08:35.857 "blobfs_create", 00:08:35.857 "blobfs_detect", 00:08:35.857 "blobfs_set_cache_size", 00:08:35.857 "bdev_crypto_delete", 00:08:35.857 "bdev_crypto_create", 00:08:35.857 "bdev_compress_delete", 00:08:35.857 "bdev_compress_create", 00:08:35.857 "bdev_compress_get_orphans", 00:08:35.857 "bdev_aio_delete", 00:08:35.857 "bdev_aio_rescan", 00:08:35.857 "bdev_aio_create", 00:08:35.857 "bdev_ftl_set_property", 00:08:35.857 "bdev_ftl_get_properties", 00:08:35.857 "bdev_ftl_get_stats", 00:08:35.857 "bdev_ftl_unmap", 00:08:35.857 "bdev_ftl_unload", 00:08:35.857 "bdev_ftl_delete", 00:08:35.857 "bdev_ftl_load", 00:08:35.857 "bdev_ftl_create", 00:08:35.857 "bdev_virtio_attach_controller", 00:08:35.857 "bdev_virtio_scsi_get_devices", 00:08:35.857 "bdev_virtio_detach_controller", 00:08:35.857 "bdev_virtio_blk_set_hotplug", 00:08:35.857 "bdev_iscsi_delete", 00:08:35.857 "bdev_iscsi_create", 00:08:35.857 "bdev_iscsi_set_options", 00:08:35.857 "accel_error_inject_error", 00:08:35.857 "ioat_scan_accel_module", 00:08:35.857 "dsa_scan_accel_module", 00:08:35.857 "iaa_scan_accel_module", 00:08:35.857 "dpdk_cryptodev_get_driver", 00:08:35.857 "dpdk_cryptodev_set_driver", 00:08:35.857 "dpdk_cryptodev_scan_accel_module", 00:08:35.857 "compressdev_scan_accel_module", 00:08:35.857 "keyring_file_remove_key", 00:08:35.857 "keyring_file_add_key", 00:08:35.857 "keyring_linux_set_options", 00:08:35.857 "iscsi_get_histogram", 00:08:35.857 "iscsi_enable_histogram", 00:08:35.857 "iscsi_set_options", 00:08:35.857 "iscsi_get_auth_groups", 00:08:35.857 "iscsi_auth_group_remove_secret", 00:08:35.857 "iscsi_auth_group_add_secret", 00:08:35.857 "iscsi_delete_auth_group", 00:08:35.857 "iscsi_create_auth_group", 00:08:35.857 "iscsi_set_discovery_auth", 00:08:35.857 "iscsi_get_options", 00:08:35.857 "iscsi_target_node_request_logout", 00:08:35.857 "iscsi_target_node_set_redirect", 00:08:35.857 "iscsi_target_node_set_auth", 00:08:35.857 "iscsi_target_node_add_lun", 00:08:35.857 "iscsi_get_stats", 00:08:35.857 "iscsi_get_connections", 00:08:35.857 "iscsi_portal_group_set_auth", 00:08:35.857 "iscsi_start_portal_group", 00:08:35.857 "iscsi_delete_portal_group", 00:08:35.857 "iscsi_create_portal_group", 00:08:35.857 "iscsi_get_portal_groups", 00:08:35.857 "iscsi_delete_target_node", 00:08:35.857 "iscsi_target_node_remove_pg_ig_maps", 00:08:35.857 "iscsi_target_node_add_pg_ig_maps", 00:08:35.857 "iscsi_create_target_node", 00:08:35.857 "iscsi_get_target_nodes", 00:08:35.857 "iscsi_delete_initiator_group", 00:08:35.857 "iscsi_initiator_group_remove_initiators", 00:08:35.857 "iscsi_initiator_group_add_initiators", 00:08:35.857 "iscsi_create_initiator_group", 00:08:35.857 "iscsi_get_initiator_groups", 00:08:35.857 "nvmf_set_crdt", 00:08:35.857 "nvmf_set_config", 00:08:35.857 "nvmf_set_max_subsystems", 00:08:35.857 "nvmf_stop_mdns_prr", 00:08:35.857 "nvmf_publish_mdns_prr", 00:08:35.857 "nvmf_subsystem_get_listeners", 00:08:35.857 
"nvmf_subsystem_get_qpairs", 00:08:35.857 "nvmf_subsystem_get_controllers", 00:08:35.857 "nvmf_get_stats", 00:08:35.857 "nvmf_get_transports", 00:08:35.857 "nvmf_create_transport", 00:08:35.857 "nvmf_get_targets", 00:08:35.857 "nvmf_delete_target", 00:08:35.857 "nvmf_create_target", 00:08:35.857 "nvmf_subsystem_allow_any_host", 00:08:35.857 "nvmf_subsystem_remove_host", 00:08:35.857 "nvmf_subsystem_add_host", 00:08:35.857 "nvmf_ns_remove_host", 00:08:35.857 "nvmf_ns_add_host", 00:08:35.857 "nvmf_subsystem_remove_ns", 00:08:35.857 "nvmf_subsystem_add_ns", 00:08:35.857 "nvmf_subsystem_listener_set_ana_state", 00:08:35.857 "nvmf_discovery_get_referrals", 00:08:35.857 "nvmf_discovery_remove_referral", 00:08:35.857 "nvmf_discovery_add_referral", 00:08:35.857 "nvmf_subsystem_remove_listener", 00:08:35.857 "nvmf_subsystem_add_listener", 00:08:35.857 "nvmf_delete_subsystem", 00:08:35.857 "nvmf_create_subsystem", 00:08:35.857 "nvmf_get_subsystems", 00:08:35.857 "env_dpdk_get_mem_stats", 00:08:35.857 "nbd_get_disks", 00:08:35.857 "nbd_stop_disk", 00:08:35.857 "nbd_start_disk", 00:08:35.857 "ublk_recover_disk", 00:08:35.857 "ublk_get_disks", 00:08:35.857 "ublk_stop_disk", 00:08:35.857 "ublk_start_disk", 00:08:35.857 "ublk_destroy_target", 00:08:35.857 "ublk_create_target", 00:08:35.857 "virtio_blk_create_transport", 00:08:35.857 "virtio_blk_get_transports", 00:08:35.857 "vhost_controller_set_coalescing", 00:08:35.857 "vhost_get_controllers", 00:08:35.857 "vhost_delete_controller", 00:08:35.857 "vhost_create_blk_controller", 00:08:35.857 "vhost_scsi_controller_remove_target", 00:08:35.857 "vhost_scsi_controller_add_target", 00:08:35.857 "vhost_start_scsi_controller", 00:08:35.857 "vhost_create_scsi_controller", 00:08:35.857 "thread_set_cpumask", 00:08:35.857 "framework_get_governor", 00:08:35.857 "framework_get_scheduler", 00:08:35.857 "framework_set_scheduler", 00:08:35.857 "framework_get_reactors", 00:08:35.857 "thread_get_io_channels", 00:08:35.857 "thread_get_pollers", 00:08:35.857 "thread_get_stats", 00:08:35.857 "framework_monitor_context_switch", 00:08:35.857 "spdk_kill_instance", 00:08:35.857 "log_enable_timestamps", 00:08:35.857 "log_get_flags", 00:08:35.857 "log_clear_flag", 00:08:35.857 "log_set_flag", 00:08:35.857 "log_get_level", 00:08:35.857 "log_set_level", 00:08:35.857 "log_get_print_level", 00:08:35.857 "log_set_print_level", 00:08:35.857 "framework_enable_cpumask_locks", 00:08:35.857 "framework_disable_cpumask_locks", 00:08:35.857 "framework_wait_init", 00:08:35.857 "framework_start_init", 00:08:35.857 "scsi_get_devices", 00:08:35.857 "bdev_get_histogram", 00:08:35.857 "bdev_enable_histogram", 00:08:35.857 "bdev_set_qos_limit", 00:08:35.857 "bdev_set_qd_sampling_period", 00:08:35.857 "bdev_get_bdevs", 00:08:35.857 "bdev_reset_iostat", 00:08:35.857 "bdev_get_iostat", 00:08:35.857 "bdev_examine", 00:08:35.857 "bdev_wait_for_examine", 00:08:35.857 "bdev_set_options", 00:08:35.857 "notify_get_notifications", 00:08:35.857 "notify_get_types", 00:08:35.857 "accel_get_stats", 00:08:35.857 "accel_set_options", 00:08:35.857 "accel_set_driver", 00:08:35.857 "accel_crypto_key_destroy", 00:08:35.857 "accel_crypto_keys_get", 00:08:35.857 "accel_crypto_key_create", 00:08:35.857 "accel_assign_opc", 00:08:35.857 "accel_get_module_info", 00:08:35.857 "accel_get_opc_assignments", 00:08:35.857 "vmd_rescan", 00:08:35.857 "vmd_remove_device", 00:08:35.857 "vmd_enable", 00:08:35.857 "sock_get_default_impl", 00:08:35.857 "sock_set_default_impl", 00:08:35.857 "sock_impl_set_options", 00:08:35.857 
"sock_impl_get_options", 00:08:35.857 "iobuf_get_stats", 00:08:35.858 "iobuf_set_options", 00:08:35.858 "framework_get_pci_devices", 00:08:35.858 "framework_get_config", 00:08:35.858 "framework_get_subsystems", 00:08:35.858 "trace_get_info", 00:08:35.858 "trace_get_tpoint_group_mask", 00:08:35.858 "trace_disable_tpoint_group", 00:08:35.858 "trace_enable_tpoint_group", 00:08:35.858 "trace_clear_tpoint_mask", 00:08:35.858 "trace_set_tpoint_mask", 00:08:35.858 "keyring_get_keys", 00:08:35.858 "spdk_get_version", 00:08:35.858 "rpc_get_methods" 00:08:35.858 ] 00:08:35.858 15:44:56 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:35.858 15:44:56 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:08:35.858 15:44:56 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 2461089 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@948 -- # '[' -z 2461089 ']' 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 2461089 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2461089 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2461089' 00:08:35.858 killing process with pid 2461089 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 2461089 00:08:35.858 15:44:56 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 2461089 00:08:36.118 00:08:36.118 real 0m1.489s 00:08:36.118 user 0m2.779s 00:08:36.118 sys 0m0.459s 00:08:36.118 15:44:56 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:36.118 15:44:56 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:36.118 ************************************ 00:08:36.118 END TEST spdkcli_tcp 00:08:36.118 ************************************ 00:08:36.118 15:44:56 -- common/autotest_common.sh@1142 -- # return 0 00:08:36.118 15:44:56 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:08:36.118 15:44:56 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:36.118 15:44:56 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:36.118 15:44:56 -- common/autotest_common.sh@10 -- # set +x 00:08:36.378 ************************************ 00:08:36.378 START TEST dpdk_mem_utility 00:08:36.378 ************************************ 00:08:36.378 15:44:56 dpdk_mem_utility -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:08:36.378 * Looking for test storage... 
00:08:36.378 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:08:36.378 15:44:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:08:36.378 15:44:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2461419 00:08:36.378 15:44:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2461419 00:08:36.378 15:44:56 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:36.378 15:44:56 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 2461419 ']' 00:08:36.378 15:44:56 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:36.378 15:44:56 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:36.378 15:44:56 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:36.378 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:36.378 15:44:56 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:36.378 15:44:56 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:36.378 [2024-07-12 15:44:56.744505] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:08:36.378 [2024-07-12 15:44:56.744574] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2461419 ] 00:08:36.638 [2024-07-12 15:44:56.833984] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.638 [2024-07-12 15:44:56.901573] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.208 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:37.208 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:08:37.208 15:44:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:08:37.208 15:44:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:08:37.208 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:37.208 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:37.208 { 00:08:37.208 "filename": "/tmp/spdk_mem_dump.txt" 00:08:37.208 } 00:08:37.208 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:37.208 15:44:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:08:37.471 DPDK memory size 816.000000 MiB in 2 heap(s) 00:08:37.471 2 heaps totaling size 816.000000 MiB 00:08:37.471 size: 814.000000 MiB heap id: 0 00:08:37.471 size: 2.000000 MiB heap id: 1 00:08:37.471 end heaps---------- 00:08:37.471 8 mempools totaling size 598.116089 MiB 00:08:37.471 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:08:37.471 size: 158.602051 MiB name: PDU_data_out_Pool 00:08:37.471 size: 84.521057 MiB name: bdev_io_2461419 00:08:37.471 size: 51.011292 MiB name: evtpool_2461419 00:08:37.471 size: 50.003479 MiB name: 
msgpool_2461419 00:08:37.471 size: 21.763794 MiB name: PDU_Pool 00:08:37.471 size: 19.513306 MiB name: SCSI_TASK_Pool 00:08:37.471 size: 0.026123 MiB name: Session_Pool 00:08:37.471 end mempools------- 00:08:37.471 201 memzones totaling size 4.176453 MiB 00:08:37.471 size: 1.000366 MiB name: RG_ring_0_2461419 00:08:37.471 size: 1.000366 MiB name: RG_ring_1_2461419 00:08:37.471 size: 1.000366 MiB name: RG_ring_4_2461419 00:08:37.471 size: 1.000366 MiB name: RG_ring_5_2461419 00:08:37.471 size: 0.125366 MiB name: RG_ring_2_2461419 00:08:37.471 size: 0.015991 MiB name: RG_ring_3_2461419 00:08:37.471 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:08:37.471 size: 0.000305 MiB name: 0000:cc:01.0_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:01.1_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:01.2_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:01.3_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:01.4_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:01.5_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:01.6_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:01.7_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:02.0_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:02.1_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:02.2_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:02.3_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:02.4_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:02.5_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:02.6_qat 00:08:37.471 size: 0.000305 MiB name: 0000:cc:02.7_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:01.0_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:01.1_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:01.2_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:01.3_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:01.4_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:01.5_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:01.6_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:01.7_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:02.0_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:02.1_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:02.2_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:02.3_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:02.4_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:02.5_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:02.6_qat 00:08:37.471 size: 0.000305 MiB name: 0000:ce:02.7_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:01.0_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:01.1_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:01.2_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:01.3_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:01.4_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:01.5_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:01.6_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:01.7_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:02.0_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:02.1_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:02.2_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:02.3_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:02.4_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:02.5_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:02.6_qat 00:08:37.471 size: 0.000305 MiB name: 0000:d0:02.7_qat 00:08:37.471 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_0 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_1 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_0 00:08:37.471 size: 0.000122 
MiB name: rte_cryptodev_data_2 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_3 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_1 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_4 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_5 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_2 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_6 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_7 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_3 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_8 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_9 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_4 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_10 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_11 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_5 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_12 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_13 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_6 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_14 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_15 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_7 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_16 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_17 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_8 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_18 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_19 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_9 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_20 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_21 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_10 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_22 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_23 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_11 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_24 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_25 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_12 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_26 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_27 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_13 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_28 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_29 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_14 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_30 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_31 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_15 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_32 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_33 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_16 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_34 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_35 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_17 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_36 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_37 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_18 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_38 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_39 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_19 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_40 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_41 00:08:37.471 size: 
0.000122 MiB name: rte_compressdev_data_20 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_42 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_43 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_21 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_44 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_45 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_22 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_46 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_47 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_23 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_48 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_49 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_24 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_50 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_51 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_25 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_52 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_53 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_26 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_54 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_55 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_27 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_56 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_57 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_28 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_58 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_59 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_29 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_60 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_61 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_30 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_62 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_63 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_31 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_64 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_65 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_32 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_66 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_67 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_33 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_68 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_69 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_34 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_70 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_71 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_35 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_72 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_73 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_36 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_74 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_75 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_37 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_76 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_77 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_38 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_78 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_79 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_39 00:08:37.471 size: 0.000122 MiB name: 
rte_cryptodev_data_80 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_81 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_40 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_82 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_83 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_41 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_84 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_85 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_42 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_86 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_87 00:08:37.471 size: 0.000122 MiB name: rte_compressdev_data_43 00:08:37.471 size: 0.000122 MiB name: rte_cryptodev_data_88 00:08:37.472 size: 0.000122 MiB name: rte_cryptodev_data_89 00:08:37.472 size: 0.000122 MiB name: rte_compressdev_data_44 00:08:37.472 size: 0.000122 MiB name: rte_cryptodev_data_90 00:08:37.472 size: 0.000122 MiB name: rte_cryptodev_data_91 00:08:37.472 size: 0.000122 MiB name: rte_compressdev_data_45 00:08:37.472 size: 0.000122 MiB name: rte_cryptodev_data_92 00:08:37.472 size: 0.000122 MiB name: rte_cryptodev_data_93 00:08:37.472 size: 0.000122 MiB name: rte_compressdev_data_46 00:08:37.472 size: 0.000122 MiB name: rte_cryptodev_data_94 00:08:37.472 size: 0.000122 MiB name: rte_cryptodev_data_95 00:08:37.472 size: 0.000122 MiB name: rte_compressdev_data_47 00:08:37.472 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:08:37.472 end memzones------- 00:08:37.472 15:44:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:08:37.472 heap id: 0 total size: 814.000000 MiB number of busy elements: 496 number of free elements: 14 00:08:37.472 list of free elements. size: 11.842346 MiB 00:08:37.472 element at address: 0x200000400000 with size: 1.999512 MiB 00:08:37.472 element at address: 0x200018e00000 with size: 0.999878 MiB 00:08:37.472 element at address: 0x200019000000 with size: 0.999878 MiB 00:08:37.472 element at address: 0x200003e00000 with size: 0.996460 MiB 00:08:37.472 element at address: 0x200031c00000 with size: 0.994446 MiB 00:08:37.472 element at address: 0x200007000000 with size: 0.991760 MiB 00:08:37.472 element at address: 0x200013800000 with size: 0.978882 MiB 00:08:37.472 element at address: 0x200019200000 with size: 0.937256 MiB 00:08:37.472 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:08:37.472 element at address: 0x200003a00000 with size: 0.498535 MiB 00:08:37.472 element at address: 0x20000b200000 with size: 0.491272 MiB 00:08:37.472 element at address: 0x200000800000 with size: 0.486145 MiB 00:08:37.472 element at address: 0x200019400000 with size: 0.485840 MiB 00:08:37.472 element at address: 0x200027e00000 with size: 0.399231 MiB 00:08:37.472 list of standard malloc elements. 
size: 199.872803 MiB 00:08:37.472 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:08:37.472 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:08:37.472 element at address: 0x200018efff80 with size: 1.000122 MiB 00:08:37.472 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:08:37.472 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:08:37.472 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:08:37.472 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:08:37.472 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:08:37.472 element at address: 0x20000033b340 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000033e8c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000341e40 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003453c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000348940 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000034bec0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000034f440 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003529c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000355f40 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003594c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000035ca40 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000035ffc0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000363540 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000366ac0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000036a040 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000036d5c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000370b40 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003740c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000377640 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000037abc0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000037e140 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003816c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000384c40 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003881c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000038b740 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000038ecc0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000392240 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003957c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000398d40 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000039c2c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x20000039f840 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003a2dc0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003a6340 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003a98c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003ace40 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003b03c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003b3940 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003b6ec0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003ba440 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003bd9c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003c0f40 with size: 0.004395 MiB 
00:08:37.472 element at address: 0x2000003c44c0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003c7a40 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003cafc0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003ce540 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003d1ac0 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003d5040 with size: 0.004395 MiB 00:08:37.472 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:08:37.472 element at address: 0x200000339240 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000033a2c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000033c7c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000033d840 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000033fd40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000340dc0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003432c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000344340 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000346840 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003478c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000349dc0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000034ae40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000034d340 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000034e3c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003508c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000351940 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000353e40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000354ec0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003573c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000358440 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000035a940 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000035b9c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000035dec0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000035ef40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000361440 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003624c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003649c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000365a40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000367f40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000368fc0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000036b4c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000036c540 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000036ea40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000036fac0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000371fc0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000373040 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000375540 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003765c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000378ac0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000379b40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000037c040 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000037d0c0 with size: 0.004028 MiB 00:08:37.472 element at 
address: 0x20000037f5c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000380640 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000382b40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000383bc0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003860c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000387140 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000389640 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000038a6c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000038cbc0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000038dc40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000390140 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003911c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003936c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000394740 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000396c40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x200000397cc0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000039a1c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000039b240 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000039d740 with size: 0.004028 MiB 00:08:37.472 element at address: 0x20000039e7c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003a0cc0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003a1d40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003a4240 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003a52c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003a77c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003a8840 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003aad40 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003abdc0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003ae2c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003af340 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003b1840 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003b28c0 with size: 0.004028 MiB 00:08:37.472 element at address: 0x2000003b4dc0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003b5e40 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003b8340 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003b93c0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003bb8c0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003bc940 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003bee40 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003bfec0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003c23c0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003c3440 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003c5940 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003c69c0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003c8ec0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003c9f40 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003cc440 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003cd4c0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003cf9c0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003d0a40 
with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003d2f40 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003d3fc0 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:08:37.473 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:08:37.473 element at address: 0x200000200000 with size: 0.000305 MiB 00:08:37.473 element at address: 0x20000020ea00 with size: 0.000305 MiB 00:08:37.473 element at address: 0x200000200140 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200200 with size: 0.000183 MiB 00:08:37.473 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200380 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200440 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200500 with size: 0.000183 MiB 00:08:37.473 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200680 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200740 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200800 with size: 0.000183 MiB 00:08:37.473 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200980 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200a40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200b00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200c80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000200d40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209000 with size: 0.000183 MiB 00:08:37.473 element at address: 0x2000002090c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209180 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209240 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209300 with size: 0.000183 MiB 00:08:37.473 element at address: 0x2000002093c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209480 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209540 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209600 with size: 0.000183 MiB 00:08:37.473 element at address: 0x2000002096c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209780 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209840 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209900 with size: 0.000183 MiB 00:08:37.473 element at address: 0x2000002099c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209a80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209b40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209c00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209cc0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209d80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209e40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209f00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x200000209fc0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a080 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a140 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a200 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a2c0 with size: 0.000183 MiB 
00:08:37.473 element at address: 0x20000020a380 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a440 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a500 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a5c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a680 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a740 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a800 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a8c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020a980 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020aa40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ab00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020abc0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ac80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ad40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ae00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020aec0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020af80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b040 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b100 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b1c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b280 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b340 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b400 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b4c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b580 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b640 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b700 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b7c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b880 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020b940 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ba00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020bac0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020bb80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020bc40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020bd00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020bdc0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020be80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020bf40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c000 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c0c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c180 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c240 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c300 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c3c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c480 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c540 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c600 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c6c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c780 with size: 0.000183 MiB 00:08:37.473 element at 
address: 0x20000020c840 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c900 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020c9c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ca80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020cb40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020cc00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ccc0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020cd80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ce40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020cf00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020cfc0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d080 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d140 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d200 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d2c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d380 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d440 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d500 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d5c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d680 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d740 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d800 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d8c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020d980 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020da40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020db00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020dbc0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020dc80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020dd40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020de00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020dec0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020df80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e040 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e100 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e1c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e280 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e340 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e400 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e4c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e580 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e640 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e700 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e7c0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e880 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020e940 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020eb40 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ec00 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ecc0 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ed80 with size: 0.000183 MiB 00:08:37.473 element at address: 0x20000020ee40 
with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020ef00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020efc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f080 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f140 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f200 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f2c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f380 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f440 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f500 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f5c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f680 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f740 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f800 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f8c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020f980 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020fa40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020fb00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020fbc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020fc80 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020fd40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020fe00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020fec0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000020ff80 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210040 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210100 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000002101c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210280 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210340 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210400 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000002104c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210580 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210640 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210700 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000002107c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210880 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210940 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210a00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210ac0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000210cc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000214f80 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235240 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235300 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000002353c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235480 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235540 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235600 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000002356c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235780 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235840 with size: 0.000183 MiB 
00:08:37.474 element at address: 0x200000235900 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000002359c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235a80 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235b40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235c00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235cc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235d80 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235e40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000235f00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236100 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000002361c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236280 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236340 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236400 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000002364c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236580 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236640 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236700 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000002367c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236880 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236940 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236a00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236ac0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236b80 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236c40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000236d00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000338f00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000338fc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000033c540 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000033fac0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000343040 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003465c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000349b40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000034d0c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000350640 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000353bc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000357140 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000035a6c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000035dc40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003611c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000364740 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000367cc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000036b240 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000036e7c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000371d40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003752c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000378840 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000037bdc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000037f340 with size: 0.000183 MiB 00:08:37.474 element at 
address: 0x2000003828c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000385e40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003893c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000038c940 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000038fec0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000393440 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003969c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200000399f40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000039d4c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003a0a40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003a3fc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003a7540 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003aaac0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003ae040 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003b4b40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003b80c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003bb640 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003bebc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003c2140 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003c56c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003c8c40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003cc1c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003cf740 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003d2cc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000003d6840 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087c740 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087c800 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087c980 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e66340 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e66400 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d000 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d200 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d800 
with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:08:37.474 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 
00:08:37.475 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:08:37.475 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:08:37.475 list of memzone associated elements. size: 602.284851 MiB 00:08:37.475 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:08:37.475 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:08:37.475 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:08:37.475 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:08:37.475 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:08:37.475 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2461419_0 00:08:37.475 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:08:37.475 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2461419_0 00:08:37.476 element at address: 0x200003fff380 with size: 48.003052 MiB 00:08:37.476 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2461419_0 00:08:37.476 element at address: 0x2000195be940 with size: 20.255554 MiB 00:08:37.476 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:08:37.476 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:08:37.476 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:08:37.476 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:08:37.476 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2461419 00:08:37.476 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:08:37.476 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2461419 00:08:37.476 element at address: 0x200000236dc0 with size: 1.008118 MiB 00:08:37.476 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2461419 00:08:37.476 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:08:37.476 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:08:37.476 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:08:37.476 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:08:37.476 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:08:37.476 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:08:37.476 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:08:37.476 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:08:37.476 element at address: 0x200003eff180 with size: 1.000488 MiB 00:08:37.476 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2461419 00:08:37.476 element at address: 0x200003affc00 with size: 1.000488 MiB 00:08:37.476 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2461419 00:08:37.476 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:08:37.476 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2461419 00:08:37.476 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:08:37.476 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2461419 00:08:37.476 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:08:37.476 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2461419 00:08:37.476 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:08:37.476 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:08:37.476 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:08:37.476 associated memzone info: size: 0.500366 MiB name: 
RG_MP_SCSI_TASK_Pool 00:08:37.476 element at address: 0x20001947c600 with size: 0.250488 MiB 00:08:37.476 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:08:37.476 element at address: 0x200000215040 with size: 0.125488 MiB 00:08:37.476 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2461419 00:08:37.476 element at address: 0x200000200e00 with size: 0.031738 MiB 00:08:37.476 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:08:37.476 element at address: 0x200027e664c0 with size: 0.023743 MiB 00:08:37.476 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:08:37.476 element at address: 0x200000210d80 with size: 0.016113 MiB 00:08:37.476 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2461419 00:08:37.476 element at address: 0x200027e6c600 with size: 0.002441 MiB 00:08:37.476 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:08:37.476 element at address: 0x2000003d6300 with size: 0.001282 MiB 00:08:37.476 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:08:37.476 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.0_qat 00:08:37.476 element at address: 0x2000003d2d80 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.1_qat 00:08:37.476 element at address: 0x2000003cf800 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.2_qat 00:08:37.476 element at address: 0x2000003cc280 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.3_qat 00:08:37.476 element at address: 0x2000003c8d00 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.4_qat 00:08:37.476 element at address: 0x2000003c5780 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.5_qat 00:08:37.476 element at address: 0x2000003c2200 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.6_qat 00:08:37.476 element at address: 0x2000003bec80 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.7_qat 00:08:37.476 element at address: 0x2000003bb700 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.0_qat 00:08:37.476 element at address: 0x2000003b8180 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.1_qat 00:08:37.476 element at address: 0x2000003b4c00 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.2_qat 00:08:37.476 element at address: 0x2000003b1680 with size: 0.000427 MiB 00:08:37.476 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.3_qat 00:08:37.476 element at address: 0x2000003ae100 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.4_qat 00:08:37.477 element at address: 0x2000003aab80 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.5_qat 00:08:37.477 element at address: 0x2000003a7600 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.6_qat 00:08:37.477 element at address: 0x2000003a4080 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.7_qat 
00:08:37.477 element at address: 0x2000003a0b00 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.0_qat 00:08:37.477 element at address: 0x20000039d580 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.1_qat 00:08:37.477 element at address: 0x20000039a000 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.2_qat 00:08:37.477 element at address: 0x200000396a80 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.3_qat 00:08:37.477 element at address: 0x200000393500 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.4_qat 00:08:37.477 element at address: 0x20000038ff80 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.5_qat 00:08:37.477 element at address: 0x20000038ca00 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.6_qat 00:08:37.477 element at address: 0x200000389480 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.7_qat 00:08:37.477 element at address: 0x200000385f00 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.0_qat 00:08:37.477 element at address: 0x200000382980 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.1_qat 00:08:37.477 element at address: 0x20000037f400 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.2_qat 00:08:37.477 element at address: 0x20000037be80 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.3_qat 00:08:37.477 element at address: 0x200000378900 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.4_qat 00:08:37.477 element at address: 0x200000375380 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.5_qat 00:08:37.477 element at address: 0x200000371e00 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.6_qat 00:08:37.477 element at address: 0x20000036e880 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.7_qat 00:08:37.477 element at address: 0x20000036b300 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.0_qat 00:08:37.477 element at address: 0x200000367d80 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.1_qat 00:08:37.477 element at address: 0x200000364800 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.2_qat 00:08:37.477 element at address: 0x200000361280 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.3_qat 00:08:37.477 element at address: 0x20000035dd00 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.4_qat 00:08:37.477 element at address: 0x20000035a780 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.5_qat 00:08:37.477 element at address: 0x200000357200 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.6_qat 00:08:37.477 element at address: 0x200000353c80 with size: 
0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.7_qat 00:08:37.477 element at address: 0x200000350700 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.0_qat 00:08:37.477 element at address: 0x20000034d180 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.1_qat 00:08:37.477 element at address: 0x200000349c00 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.2_qat 00:08:37.477 element at address: 0x200000346680 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.3_qat 00:08:37.477 element at address: 0x200000343100 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.4_qat 00:08:37.477 element at address: 0x20000033fb80 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.5_qat 00:08:37.477 element at address: 0x20000033c600 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.6_qat 00:08:37.477 element at address: 0x200000339080 with size: 0.000427 MiB 00:08:37.477 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.7_qat 00:08:37.477 element at address: 0x2000003d6900 with size: 0.000305 MiB 00:08:37.477 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:08:37.477 element at address: 0x200000235fc0 with size: 0.000305 MiB 00:08:37.477 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2461419 00:08:37.477 element at address: 0x200000210b80 with size: 0.000305 MiB 00:08:37.477 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2461419 00:08:37.477 element at address: 0x200027e6d0c0 with size: 0.000305 MiB 00:08:37.477 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:08:37.477 element at address: 0x2000003d6240 with size: 0.000183 MiB 00:08:37.477 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:08:37.477 15:44:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:08:37.477 15:44:57 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 2461419 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 2461419 ']' 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 2461419 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2461419 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2461419' 00:08:37.477 killing process with pid 2461419 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 2461419 00:08:37.477 15:44:57 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 2461419 00:08:37.738 00:08:37.738 real 0m1.440s 00:08:37.738 user 0m1.627s 00:08:37.738 sys 0m0.407s 00:08:37.738 15:44:58 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:37.738 15:44:58 
dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:37.738 ************************************ 00:08:37.738 END TEST dpdk_mem_utility 00:08:37.738 ************************************ 00:08:37.738 15:44:58 -- common/autotest_common.sh@1142 -- # return 0 00:08:37.738 15:44:58 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:08:37.738 15:44:58 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:37.738 15:44:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.738 15:44:58 -- common/autotest_common.sh@10 -- # set +x 00:08:37.738 ************************************ 00:08:37.738 START TEST event 00:08:37.738 ************************************ 00:08:37.738 15:44:58 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:08:37.998 * Looking for test storage... 00:08:37.998 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:08:37.998 15:44:58 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:08:37.998 15:44:58 event -- bdev/nbd_common.sh@6 -- # set -e 00:08:37.998 15:44:58 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:08:37.998 15:44:58 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:08:37.998 15:44:58 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:37.998 15:44:58 event -- common/autotest_common.sh@10 -- # set +x 00:08:37.998 ************************************ 00:08:37.998 START TEST event_perf 00:08:37.998 ************************************ 00:08:37.998 15:44:58 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:08:37.998 Running I/O for 1 seconds...[2024-07-12 15:44:58.272203] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:08:37.998 [2024-07-12 15:44:58.272295] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2461649 ] 00:08:37.998 [2024-07-12 15:44:58.366836] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:37.998 [2024-07-12 15:44:58.445097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:37.998 [2024-07-12 15:44:58.445242] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:37.998 [2024-07-12 15:44:58.445386] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.998 Running I/O for 1 seconds...[2024-07-12 15:44:58.445387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:39.385 00:08:39.385 lcore 0: 75637 00:08:39.385 lcore 1: 75641 00:08:39.385 lcore 2: 75645 00:08:39.385 lcore 3: 75641 00:08:39.385 done. 
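Editor's note: the event_perf run recorded above can be reproduced by hand against the same built tree; a minimal sketch, assuming the workspace path shown in the log and that hugepages are already set up (everything below except the comments is taken from the invocation visible above):

    # run the event framework perf test on cores 0-3 (-m 0xF) for one second (-t 1);
    # it prints one event counter per lcore, as in the log output above
    sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1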
00:08:39.385 00:08:39.385 real 0m1.251s 00:08:39.385 user 0m4.140s 00:08:39.385 sys 0m0.105s 00:08:39.385 15:44:59 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:39.385 15:44:59 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:08:39.385 ************************************ 00:08:39.385 END TEST event_perf 00:08:39.385 ************************************ 00:08:39.385 15:44:59 event -- common/autotest_common.sh@1142 -- # return 0 00:08:39.385 15:44:59 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:08:39.385 15:44:59 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:39.385 15:44:59 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:39.385 15:44:59 event -- common/autotest_common.sh@10 -- # set +x 00:08:39.385 ************************************ 00:08:39.385 START TEST event_reactor 00:08:39.385 ************************************ 00:08:39.385 15:44:59 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:08:39.385 [2024-07-12 15:44:59.597734] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:08:39.385 [2024-07-12 15:44:59.597820] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2461872 ] 00:08:39.385 [2024-07-12 15:44:59.688377] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.385 [2024-07-12 15:44:59.761735] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.771 test_start 00:08:40.771 oneshot 00:08:40.771 tick 100 00:08:40.771 tick 100 00:08:40.771 tick 250 00:08:40.771 tick 100 00:08:40.771 tick 100 00:08:40.771 tick 100 00:08:40.771 tick 250 00:08:40.771 tick 500 00:08:40.771 tick 100 00:08:40.771 tick 100 00:08:40.771 tick 250 00:08:40.771 tick 100 00:08:40.771 tick 100 00:08:40.771 test_end 00:08:40.771 00:08:40.771 real 0m1.243s 00:08:40.771 user 0m1.140s 00:08:40.771 sys 0m0.098s 00:08:40.771 15:45:00 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:40.771 15:45:00 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:08:40.771 ************************************ 00:08:40.771 END TEST event_reactor 00:08:40.771 ************************************ 00:08:40.771 15:45:00 event -- common/autotest_common.sh@1142 -- # return 0 00:08:40.771 15:45:00 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:08:40.771 15:45:00 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:08:40.771 15:45:00 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:40.771 15:45:00 event -- common/autotest_common.sh@10 -- # set +x 00:08:40.771 ************************************ 00:08:40.771 START TEST event_reactor_perf 00:08:40.771 ************************************ 00:08:40.771 15:45:00 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:08:40.771 [2024-07-12 15:45:00.919707] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:08:40.771 [2024-07-12 15:45:00.919782] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2462237 ] 00:08:40.771 [2024-07-12 15:45:01.010805] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.771 [2024-07-12 15:45:01.085725] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.713 test_start 00:08:41.713 test_end 00:08:41.713 Performance: 401571 events per second 00:08:41.713 00:08:41.713 real 0m1.246s 00:08:41.713 user 0m1.143s 00:08:41.713 sys 0m0.097s 00:08:41.713 15:45:02 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:41.713 15:45:02 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:08:41.713 ************************************ 00:08:41.713 END TEST event_reactor_perf 00:08:41.713 ************************************ 00:08:41.974 15:45:02 event -- common/autotest_common.sh@1142 -- # return 0 00:08:41.974 15:45:02 event -- event/event.sh@49 -- # uname -s 00:08:41.974 15:45:02 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:08:41.974 15:45:02 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:08:41.974 15:45:02 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:41.974 15:45:02 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:41.974 15:45:02 event -- common/autotest_common.sh@10 -- # set +x 00:08:41.974 ************************************ 00:08:41.974 START TEST event_scheduler 00:08:41.974 ************************************ 00:08:41.974 15:45:02 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:08:41.974 * Looking for test storage... 00:08:41.974 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:08:41.974 15:45:02 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:08:41.974 15:45:02 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=2462658 00:08:41.974 15:45:02 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:08:41.974 15:45:02 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:08:41.974 15:45:02 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 2462658 00:08:41.974 15:45:02 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 2462658 ']' 00:08:41.974 15:45:02 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:41.974 15:45:02 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:41.974 15:45:02 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:41.975 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
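Editor's note: the scheduler test app above is started with --wait-for-rpc, so the framework stays paused until a scheduler is chosen over RPC. A minimal sketch of the two calls the log shows next, assuming the default /var/tmp/spdk.sock socket and the rpc.py path from this workspace:

    # pick the dynamic scheduler while the app is still waiting, then let
    # framework initialization proceed (the rpc_cmd calls visible below)
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py framework_set_scheduler dynamic
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py framework_start_init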
00:08:41.975 15:45:02 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:41.975 15:45:02 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:41.975 [2024-07-12 15:45:02.391430] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:08:41.975 [2024-07-12 15:45:02.391491] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2462658 ] 00:08:42.235 [2024-07-12 15:45:02.544118] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:42.496 [2024-07-12 15:45:02.714419] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.496 [2024-07-12 15:45:02.714595] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:42.496 [2024-07-12 15:45:02.714821] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:42.496 [2024-07-12 15:45:02.715006] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0 00:08:43.066 15:45:03 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 [2024-07-12 15:45:03.245793] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:08:43.066 [2024-07-12 15:45:03.245841] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:08:43.066 [2024-07-12 15:45:03.245866] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:08:43.066 [2024-07-12 15:45:03.245883] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:08:43.066 [2024-07-12 15:45:03.245899] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 [2024-07-12 15:45:03.360560] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
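Editor's note: scheduler_create_thread then drives the test entirely through the scheduler_plugin RPCs visible below. A minimal sketch of that sequence, assuming rpc.py can locate the test's scheduler_plugin module (the test itself goes through its rpc_cmd wrapper); the rpc() helper name is only for this sketch, the method names and arguments are the ones in the log:

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin scheduler_plugin "$@"; }
    rpc scheduler_thread_create -n active_pinned -m 0x1 -a 100   # thread pinned to core 0, ~100% active
    rpc scheduler_thread_set_active 11 50                        # lower thread 11 to 50% active time
    rpc scheduler_thread_delete 12                               # delete thread 12 so the scheduler rebalances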
00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 ************************************ 00:08:43.066 START TEST scheduler_create_thread 00:08:43.066 ************************************ 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 2 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 3 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 4 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 5 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 6 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 7 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 8 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 9 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:43.066 10 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:43.066 15:45:03 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.010 15:45:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:44.010 15:45:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:08:44.010 15:45:04 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:08:44.010 15:45:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:44.010 15:45:04 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:44.949 15:45:05 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:44.949 15:45:05 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:08:44.949 15:45:05 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:44.949 15:45:05 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:45.888 15:45:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:45.888 15:45:06 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:08:45.888 15:45:06 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:08:45.888 15:45:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:45.888 15:45:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:46.827 15:45:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:46.827 00:08:46.827 real 0m3.565s 00:08:46.827 user 0m0.025s 00:08:46.827 sys 0m0.004s 00:08:46.827 15:45:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:46.827 15:45:06 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:46.827 ************************************ 00:08:46.827 END TEST scheduler_create_thread 00:08:46.827 ************************************ 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:08:46.827 15:45:07 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:08:46.827 15:45:07 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 2462658 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 2462658 ']' 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 2462658 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2462658 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@954 -- # process_name=reactor_2 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2462658' 00:08:46.827 killing process with pid 2462658 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 2462658 00:08:46.827 15:45:07 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 2462658 00:08:47.088 [2024-07-12 15:45:07.347606] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
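Editor's note: both teardowns in this section (pid 2461419 for dpdk_mem_utility earlier, pid 2462658 here) go through the same killprocess helper from autotest_common.sh. A rough sketch of the shape of that helper, not the verbatim implementation, assuming the target was started by the same shell so wait can reap it:

    killprocess() {
        local pid=$1
        kill -0 "$pid" 2>/dev/null || return 0   # -0 only probes existence; nothing to do if gone
        # the real helper also inspects the process name (ps --no-headers -o comm=),
        # as the log shows, to decide whether it must kill via sudo
        echo "killing process with pid $pid"
        kill "$pid"                              # SIGTERM; the app traps it and shuts down cleanly
        wait "$pid"                              # block until the exit status is collected
    }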
00:08:47.349 00:08:47.349 real 0m5.456s 00:08:47.349 user 0m10.682s 00:08:47.349 sys 0m0.496s 00:08:47.349 15:45:07 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:47.349 15:45:07 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:47.349 ************************************ 00:08:47.349 END TEST event_scheduler 00:08:47.349 ************************************ 00:08:47.349 15:45:07 event -- common/autotest_common.sh@1142 -- # return 0 00:08:47.349 15:45:07 event -- event/event.sh@51 -- # modprobe -n nbd 00:08:47.349 15:45:07 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:08:47.349 15:45:07 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:47.349 15:45:07 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:47.349 15:45:07 event -- common/autotest_common.sh@10 -- # set +x 00:08:47.349 ************************************ 00:08:47.349 START TEST app_repeat 00:08:47.349 ************************************ 00:08:47.349 15:45:07 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@19 -- # repeat_pid=2463622 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2463622' 00:08:47.349 Process app_repeat pid: 2463622 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:08:47.349 spdk_app_start Round 0 00:08:47.349 15:45:07 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2463622 /var/tmp/spdk-nbd.sock 00:08:47.349 15:45:07 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2463622 ']' 00:08:47.349 15:45:07 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:47.349 15:45:07 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:47.349 15:45:07 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:47.349 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:08:47.349 15:45:07 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:47.349 15:45:07 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:47.610 [2024-07-12 15:45:07.807277] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
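Editor's note: each app_repeat round builds the same topology over the dedicated /var/tmp/spdk-nbd.sock RPC socket. A minimal sketch of the round-0 setup the log shows below, assuming app_repeat is already listening on that socket; the rpc() helper name is only for this sketch:

    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock "$@"; }
    rpc bdev_malloc_create 64 4096          # 64 MiB malloc bdev, 4096-byte blocks -> Malloc0
    rpc bdev_malloc_create 64 4096          # second bdev -> Malloc1
    rpc nbd_start_disk Malloc0 /dev/nbd0    # expose each bdev as a kernel NBD block device
    rpc nbd_start_disk Malloc1 /dev/nbd1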
00:08:47.610 [2024-07-12 15:45:07.807344] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2463622 ] 00:08:47.610 [2024-07-12 15:45:07.900843] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:47.610 [2024-07-12 15:45:07.974562] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:47.610 [2024-07-12 15:45:07.974566] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.549 15:45:08 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:48.549 15:45:08 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:48.549 15:45:08 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:48.549 Malloc0 00:08:48.549 15:45:08 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:48.549 Malloc1 00:08:48.810 15:45:09 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:48.810 /dev/nbd0 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 
/proc/partitions 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:48.810 1+0 records in 00:08:48.810 1+0 records out 00:08:48.810 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284892 s, 14.4 MB/s 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:48.810 15:45:09 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:48.810 15:45:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:49.070 /dev/nbd1 00:08:49.070 15:45:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:49.070 15:45:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:49.070 1+0 records in 00:08:49.070 1+0 records out 00:08:49.070 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308028 s, 13.3 MB/s 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:49.070 15:45:09 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:49.070 15:45:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:49.070 15:45:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:49.070 15:45:09 
event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:49.070 15:45:09 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:49.070 15:45:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:49.330 { 00:08:49.330 "nbd_device": "/dev/nbd0", 00:08:49.330 "bdev_name": "Malloc0" 00:08:49.330 }, 00:08:49.330 { 00:08:49.330 "nbd_device": "/dev/nbd1", 00:08:49.330 "bdev_name": "Malloc1" 00:08:49.330 } 00:08:49.330 ]' 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:49.330 { 00:08:49.330 "nbd_device": "/dev/nbd0", 00:08:49.330 "bdev_name": "Malloc0" 00:08:49.330 }, 00:08:49.330 { 00:08:49.330 "nbd_device": "/dev/nbd1", 00:08:49.330 "bdev_name": "Malloc1" 00:08:49.330 } 00:08:49.330 ]' 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:49.330 /dev/nbd1' 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:49.330 /dev/nbd1' 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:49.330 256+0 records in 00:08:49.330 256+0 records out 00:08:49.330 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0121418 s, 86.4 MB/s 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:49.330 256+0 records in 00:08:49.330 256+0 records out 00:08:49.330 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0149621 s, 70.1 MB/s 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:49.330 256+0 records in 00:08:49.330 256+0 records out 00:08:49.330 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0160096 s, 65.5 MB/s 00:08:49.330 15:45:09 event.app_repeat 
-- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:49.330 15:45:09 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:49.591 15:45:09 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:49.851 15:45:10 
event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:49.851 15:45:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:50.111 15:45:10 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:50.111 15:45:10 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:50.370 15:45:10 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:50.370 [2024-07-12 15:45:10.740366] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:50.370 [2024-07-12 15:45:10.802514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:50.370 [2024-07-12 15:45:10.802518] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.631 [2024-07-12 15:45:10.833788] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:50.631 [2024-07-12 15:45:10.833819] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:53.924 15:45:13 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:53.924 15:45:13 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:08:53.924 spdk_app_start Round 1 00:08:53.924 15:45:13 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2463622 /var/tmp/spdk-nbd.sock 00:08:53.924 15:45:13 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2463622 ']' 00:08:53.924 15:45:13 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:53.924 15:45:13 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:53.924 15:45:13 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:53.924 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
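The waitfornbd_exit calls traced above poll /proc/partitions until the given nbd device name disappears, giving up after a bounded number of attempts. A minimal sketch of that pattern (not the literal helper from nbd_common.sh; the retry interval is an assumption, not read from the trace):

    waitfornbd_exit() {
        local nbd_name=$1
        local i
        for ((i = 1; i <= 20; i++)); do
            # the device is considered gone once its name no longer shows up in /proc/partitions
            if ! grep -q -w "$nbd_name" /proc/partitions; then
                return 0
            fi
            sleep 0.1   # assumed retry interval
        done
        return 1
    }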
00:08:53.924 15:45:13 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:53.924 15:45:13 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:53.924 15:45:13 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:53.924 15:45:13 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:53.924 15:45:13 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:53.924 Malloc0 00:08:53.924 15:45:13 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:53.925 Malloc1 00:08:53.925 15:45:14 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:53.925 /dev/nbd0 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:08:53.925 1+0 records in 00:08:53.925 1+0 records out 00:08:53.925 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271869 s, 15.1 MB/s 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:53.925 15:45:14 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:53.925 15:45:14 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:54.301 /dev/nbd1 00:08:54.301 15:45:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:54.301 15:45:14 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:54.301 15:45:14 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:54.301 15:45:14 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:54.301 15:45:14 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:54.301 15:45:14 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:54.301 15:45:14 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:54.301 15:45:14 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:54.301 15:45:14 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:54.301 15:45:14 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:54.301 15:45:14 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:54.302 1+0 records in 00:08:54.302 1+0 records out 00:08:54.302 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265336 s, 15.4 MB/s 00:08:54.302 15:45:14 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:54.302 15:45:14 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:54.302 15:45:14 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:54.302 15:45:14 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:54.302 15:45:14 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:54.302 15:45:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:54.302 15:45:14 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:54.302 15:45:14 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:54.302 15:45:14 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:54.302 15:45:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:08:54.563 { 00:08:54.563 "nbd_device": "/dev/nbd0", 00:08:54.563 "bdev_name": "Malloc0" 00:08:54.563 }, 00:08:54.563 { 00:08:54.563 "nbd_device": "/dev/nbd1", 00:08:54.563 "bdev_name": "Malloc1" 00:08:54.563 } 00:08:54.563 ]' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:54.563 { 00:08:54.563 "nbd_device": "/dev/nbd0", 00:08:54.563 "bdev_name": "Malloc0" 00:08:54.563 }, 00:08:54.563 { 00:08:54.563 "nbd_device": "/dev/nbd1", 00:08:54.563 "bdev_name": "Malloc1" 00:08:54.563 } 00:08:54.563 ]' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:54.563 /dev/nbd1' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:54.563 /dev/nbd1' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:54.563 256+0 records in 00:08:54.563 256+0 records out 00:08:54.563 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125274 s, 83.7 MB/s 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:54.563 256+0 records in 00:08:54.563 256+0 records out 00:08:54.563 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0149383 s, 70.2 MB/s 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:54.563 256+0 records in 00:08:54.563 256+0 records out 00:08:54.563 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0160494 s, 65.3 MB/s 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.563 15:45:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:54.823 15:45:15 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:08:55.083 15:45:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:55.343 15:45:15 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:55.343 15:45:15 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:55.602 15:45:15 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:55.602 [2024-07-12 15:45:15.942284] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:55.602 [2024-07-12 15:45:16.003831] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:55.602 [2024-07-12 15:45:16.003835] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.602 [2024-07-12 15:45:16.035434] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:55.602 [2024-07-12 15:45:16.035469] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:58.897 15:45:18 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:58.897 15:45:18 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:08:58.897 spdk_app_start Round 2 00:08:58.897 15:45:18 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2463622 /var/tmp/spdk-nbd.sock 00:08:58.897 15:45:18 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2463622 ']' 00:08:58.897 15:45:18 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:58.897 15:45:18 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:58.897 15:45:18 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:58.897 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
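The nbd_dd_data_verify passes traced above boil down to: fill a temp file with 1 MiB of random data, dd it onto each exported /dev/nbdX with direct I/O, then cmp each device back against the same file and delete it. A condensed sketch, which folds the separate write and verify invocations into one function and uses a made-up temp path:

    nbd_dd_data_verify() {
        local tmp_file=/tmp/nbdrandtest      # hypothetical path; the test keeps its file under spdk/test/event
        local nbd_list=("$@")
        dd if=/dev/urandom of="$tmp_file" bs=4096 count=256              # 1 MiB of random data
        for nbd in "${nbd_list[@]}"; do
            dd if="$tmp_file" of="$nbd" bs=4096 count=256 oflag=direct   # write it to each device
        done
        for nbd in "${nbd_list[@]}"; do
            cmp -b -n 1M "$tmp_file" "$nbd"                              # read back and compare the first 1 MiB
        done
        rm "$tmp_file"
    }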
00:08:58.897 15:45:18 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:58.897 15:45:18 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:58.897 15:45:19 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:58.897 15:45:19 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:58.897 15:45:19 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:58.897 Malloc0 00:08:58.897 15:45:19 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:59.157 Malloc1 00:08:59.157 15:45:19 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:59.157 /dev/nbd0 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 
count=1 iflag=direct 00:08:59.157 1+0 records in 00:08:59.157 1+0 records out 00:08:59.157 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263559 s, 15.5 MB/s 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:59.157 15:45:19 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:59.157 15:45:19 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:59.416 /dev/nbd1 00:08:59.416 15:45:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:59.416 15:45:19 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:59.416 1+0 records in 00:08:59.416 1+0 records out 00:08:59.416 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000164823 s, 24.9 MB/s 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:59.416 15:45:19 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:59.416 15:45:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:59.416 15:45:19 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:59.416 15:45:19 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:59.416 15:45:19 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:59.416 15:45:19 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 
00:08:59.676 { 00:08:59.676 "nbd_device": "/dev/nbd0", 00:08:59.676 "bdev_name": "Malloc0" 00:08:59.676 }, 00:08:59.676 { 00:08:59.676 "nbd_device": "/dev/nbd1", 00:08:59.676 "bdev_name": "Malloc1" 00:08:59.676 } 00:08:59.676 ]' 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:59.676 { 00:08:59.676 "nbd_device": "/dev/nbd0", 00:08:59.676 "bdev_name": "Malloc0" 00:08:59.676 }, 00:08:59.676 { 00:08:59.676 "nbd_device": "/dev/nbd1", 00:08:59.676 "bdev_name": "Malloc1" 00:08:59.676 } 00:08:59.676 ]' 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:59.676 /dev/nbd1' 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:59.676 /dev/nbd1' 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:59.676 256+0 records in 00:08:59.676 256+0 records out 00:08:59.676 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011998 s, 87.4 MB/s 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:59.676 256+0 records in 00:08:59.676 256+0 records out 00:08:59.676 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0150045 s, 69.9 MB/s 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:59.676 15:45:20 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:59.936 256+0 records in 00:08:59.936 256+0 records out 00:08:59.936 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0161586 s, 64.9 MB/s 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:59.936 15:45:20 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:09:00.196 15:45:20 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:00.455 15:45:20 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:09:00.455 15:45:20 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:09:00.715 15:45:21 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:09:00.715 [2024-07-12 15:45:21.126312] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:00.975 [2024-07-12 15:45:21.188673] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:00.975 [2024-07-12 15:45:21.188677] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.975 [2024-07-12 15:45:21.219488] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:09:00.975 [2024-07-12 15:45:21.219521] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:09:04.270 15:45:24 event.app_repeat -- event/event.sh@38 -- # waitforlisten 2463622 /var/tmp/spdk-nbd.sock 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 2463622 ']' 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:04.270 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
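Once both devices are stopped, the script asks the nbd RPC server what is still exported and expects the count to be zero; the lone 'true' in the trace is consistent with a guard that keeps grep -c from failing the step when there are no matches. Roughly:

    disks_json=$(./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks)
    disks_name=$(echo "$disks_json" | jq -r '.[] | .nbd_device')
    count=$(echo "$disks_name" | grep -c /dev/nbd || true)   # '[]' -> empty output -> count=0
    [ "$count" -eq 0 ]                                       # the test fails here if anything is still exported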
00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:09:04.270 15:45:24 event.app_repeat -- event/event.sh@39 -- # killprocess 2463622 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 2463622 ']' 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 2463622 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2463622 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2463622' 00:09:04.270 killing process with pid 2463622 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@967 -- # kill 2463622 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@972 -- # wait 2463622 00:09:04.270 spdk_app_start is called in Round 0. 00:09:04.270 Shutdown signal received, stop current app iteration 00:09:04.270 Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 reinitialization... 00:09:04.270 spdk_app_start is called in Round 1. 00:09:04.270 Shutdown signal received, stop current app iteration 00:09:04.270 Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 reinitialization... 00:09:04.270 spdk_app_start is called in Round 2. 00:09:04.270 Shutdown signal received, stop current app iteration 00:09:04.270 Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 reinitialization... 00:09:04.270 spdk_app_start is called in Round 3. 
00:09:04.270 Shutdown signal received, stop current app iteration 00:09:04.270 15:45:24 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:09:04.270 15:45:24 event.app_repeat -- event/event.sh@42 -- # return 0 00:09:04.270 00:09:04.270 real 0m16.632s 00:09:04.270 user 0m36.598s 00:09:04.270 sys 0m2.332s 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.270 15:45:24 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:04.270 ************************************ 00:09:04.270 END TEST app_repeat 00:09:04.270 ************************************ 00:09:04.270 15:45:24 event -- common/autotest_common.sh@1142 -- # return 0 00:09:04.270 15:45:24 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:09:04.270 00:09:04.270 real 0m26.336s 00:09:04.270 user 0m53.899s 00:09:04.270 sys 0m3.468s 00:09:04.270 15:45:24 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.270 15:45:24 event -- common/autotest_common.sh@10 -- # set +x 00:09:04.270 ************************************ 00:09:04.270 END TEST event 00:09:04.270 ************************************ 00:09:04.270 15:45:24 -- common/autotest_common.sh@1142 -- # return 0 00:09:04.270 15:45:24 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:09:04.270 15:45:24 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:04.270 15:45:24 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.270 15:45:24 -- common/autotest_common.sh@10 -- # set +x 00:09:04.270 ************************************ 00:09:04.270 START TEST thread 00:09:04.270 ************************************ 00:09:04.270 15:45:24 thread -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:09:04.270 * Looking for test storage... 00:09:04.270 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:09:04.270 15:45:24 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:09:04.270 15:45:24 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:04.270 15:45:24 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.270 15:45:24 thread -- common/autotest_common.sh@10 -- # set +x 00:09:04.270 ************************************ 00:09:04.270 START TEST thread_poller_perf 00:09:04.270 ************************************ 00:09:04.270 15:45:24 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:09:04.270 [2024-07-12 15:45:24.671161] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:04.270 [2024-07-12 15:45:24.671225] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2467064 ] 00:09:04.532 [2024-07-12 15:45:24.764158] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.532 [2024-07-12 15:45:24.838802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.532 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:09:05.473 ====================================== 00:09:05.473 busy:2608351724 (cyc) 00:09:05.473 total_run_count: 311000 00:09:05.473 tsc_hz: 2600000000 (cyc) 00:09:05.473 ====================================== 00:09:05.473 poller_cost: 8386 (cyc), 3225 (nsec) 00:09:05.473 00:09:05.473 real 0m1.251s 00:09:05.473 user 0m1.146s 00:09:05.473 sys 0m0.100s 00:09:05.473 15:45:25 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:05.473 15:45:25 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:09:05.473 ************************************ 00:09:05.473 END TEST thread_poller_perf 00:09:05.473 ************************************ 00:09:05.733 15:45:25 thread -- common/autotest_common.sh@1142 -- # return 0 00:09:05.733 15:45:25 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:09:05.733 15:45:25 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:05.733 15:45:25 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:05.733 15:45:25 thread -- common/autotest_common.sh@10 -- # set +x 00:09:05.733 ************************************ 00:09:05.733 START TEST thread_poller_perf 00:09:05.733 ************************************ 00:09:05.733 15:45:25 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:09:05.733 [2024-07-12 15:45:25.996248] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:05.733 [2024-07-12 15:45:25.996314] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2467282 ] 00:09:05.733 [2024-07-12 15:45:26.067638] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:05.733 [2024-07-12 15:45:26.131557] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:05.733 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:09:07.117 ====================================== 00:09:07.117 busy:2602244640 (cyc) 00:09:07.117 total_run_count: 4123000 00:09:07.117 tsc_hz: 2600000000 (cyc) 00:09:07.117 ====================================== 00:09:07.117 poller_cost: 631 (cyc), 242 (nsec) 00:09:07.117 00:09:07.117 real 0m1.213s 00:09:07.117 user 0m1.129s 00:09:07.117 sys 0m0.080s 00:09:07.117 15:45:27 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.117 15:45:27 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:09:07.117 ************************************ 00:09:07.117 END TEST thread_poller_perf 00:09:07.117 ************************************ 00:09:07.117 15:45:27 thread -- common/autotest_common.sh@1142 -- # return 0 00:09:07.117 15:45:27 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:09:07.117 00:09:07.117 real 0m2.711s 00:09:07.117 user 0m2.372s 00:09:07.117 sys 0m0.342s 00:09:07.117 15:45:27 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.117 15:45:27 thread -- common/autotest_common.sh@10 -- # set +x 00:09:07.117 ************************************ 00:09:07.117 END TEST thread 00:09:07.117 ************************************ 00:09:07.117 15:45:27 -- common/autotest_common.sh@1142 -- # return 0 00:09:07.117 15:45:27 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:09:07.117 15:45:27 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:07.117 15:45:27 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:07.117 15:45:27 -- common/autotest_common.sh@10 -- # set +x 00:09:07.117 ************************************ 00:09:07.117 START TEST accel 00:09:07.117 ************************************ 00:09:07.117 15:45:27 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:09:07.117 * Looking for test storage... 00:09:07.117 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:09:07.117 15:45:27 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:09:07.117 15:45:27 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:09:07.117 15:45:27 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:07.117 15:45:27 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2467482 00:09:07.117 15:45:27 accel -- accel/accel.sh@63 -- # waitforlisten 2467482 00:09:07.117 15:45:27 accel -- common/autotest_common.sh@829 -- # '[' -z 2467482 ']' 00:09:07.117 15:45:27 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:07.117 15:45:27 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:07.117 15:45:27 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:07.117 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
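The poller_cost figures printed by both poller_perf runs above are simply busy cycles divided by run count, converted to nanoseconds with the reported tsc_hz. For the 1-microsecond-period run: 2608351724 / 311000 ≈ 8386 cycles per poll, and 8386 / 2.6 GHz ≈ 3225 ns, matching the output. The same check with bc (assuming bc is available on the build host):

    echo 'scale=0; 2608351724 / 311000; 8386 * 1000000000 / 2600000000' | bc
    # prints 8386 and 3225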
00:09:07.117 15:45:27 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:09:07.117 15:45:27 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:07.117 15:45:27 accel -- accel/accel.sh@61 -- # build_accel_config 00:09:07.117 15:45:27 accel -- common/autotest_common.sh@10 -- # set +x 00:09:07.117 15:45:27 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:07.117 15:45:27 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:07.117 15:45:27 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:07.117 15:45:27 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:07.117 15:45:27 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:07.117 15:45:27 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:07.117 15:45:27 accel -- accel/accel.sh@41 -- # jq -r . 00:09:07.117 [2024-07-12 15:45:27.462296] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:07.117 [2024-07-12 15:45:27.462344] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2467482 ] 00:09:07.117 [2024-07-12 15:45:27.532601] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.377 [2024-07-12 15:45:27.595066] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.948 15:45:28 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:07.948 15:45:28 accel -- common/autotest_common.sh@862 -- # return 0 00:09:07.948 15:45:28 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:09:07.948 15:45:28 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:09:07.948 15:45:28 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:09:07.948 15:45:28 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:09:07.948 15:45:28 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:09:07.948 15:45:28 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:09:07.948 15:45:28 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:09:07.948 15:45:28 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:07.948 15:45:28 accel -- common/autotest_common.sh@10 -- # set +x 00:09:07.948 15:45:28 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:07.948 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.948 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.948 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.948 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.948 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.948 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.948 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.948 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.948 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.948 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 
15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # IFS== 00:09:07.949 15:45:28 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:07.949 15:45:28 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:07.949 15:45:28 accel -- accel/accel.sh@75 -- # killprocess 2467482 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@948 -- # '[' -z 2467482 ']' 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@952 -- # kill -0 2467482 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@953 -- # uname 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2467482 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2467482' 00:09:07.949 killing process with pid 2467482 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@967 -- # kill 2467482 00:09:07.949 15:45:28 accel -- common/autotest_common.sh@972 -- # wait 2467482 00:09:08.209 15:45:28 accel -- accel/accel.sh@76 -- # trap - ERR 00:09:08.209 15:45:28 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:09:08.209 15:45:28 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:08.209 15:45:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:08.209 15:45:28 accel -- common/autotest_common.sh@10 -- # set +x 00:09:08.209 15:45:28 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:09:08.209 15:45:28 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:09:08.209 15:45:28 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:09:08.209 15:45:28 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:08.209 15:45:28 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:08.209 15:45:28 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:08.209 15:45:28 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:08.209 15:45:28 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:08.209 15:45:28 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:09:08.209 15:45:28 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
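The long run of 'IFS==' / 'read -r opc module' lines above is the test parsing accel_get_opc_assignments: jq flattens the opcode-to-module JSON map into key=value pairs, which are read into the expected_opcs associative array (every opcode resolves to the software module in this run). A compact equivalent of that parse, not the literal accel.sh code:

    declare -A expected_opcs
    while IFS='=' read -r opc module; do
        expected_opcs["$opc"]=$module
    done < <(./scripts/rpc.py accel_get_opc_assignments \
             | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]')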
00:09:08.507 15:45:28 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:08.507 15:45:28 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:09:08.507 15:45:28 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:08.507 15:45:28 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:09:08.507 15:45:28 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:08.507 15:45:28 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:08.507 15:45:28 accel -- common/autotest_common.sh@10 -- # set +x 00:09:08.507 ************************************ 00:09:08.507 START TEST accel_missing_filename 00:09:08.507 ************************************ 00:09:08.507 15:45:28 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:09:08.507 15:45:28 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:09:08.507 15:45:28 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:09:08.507 15:45:28 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:08.507 15:45:28 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:08.507 15:45:28 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:08.507 15:45:28 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:08.507 15:45:28 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:09:08.507 15:45:28 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:09:08.507 15:45:28 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:09:08.507 15:45:28 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:08.508 15:45:28 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:08.508 15:45:28 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:08.508 15:45:28 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:08.508 15:45:28 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:08.508 15:45:28 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:09:08.508 15:45:28 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:09:08.508 [2024-07-12 15:45:28.778923] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:08.508 [2024-07-12 15:45:28.778983] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2467802 ] 00:09:08.508 [2024-07-12 15:45:28.847332] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:08.508 [2024-07-12 15:45:28.910058] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.508 [2024-07-12 15:45:28.950033] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:08.768 [2024-07-12 15:45:28.985994] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:09:08.768 A filename is required. 
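The accel_missing_filename case starting here is a negative test: accel_perf is launched with '-t 1 -w compress' but without the '-l <input file>' that a compress workload needs, so it aborts with "A filename is required." and the NOT wrapper counts the failure as a pass. A rough stand-alone equivalent of that check (ACCEL_PERF is a placeholder for the built example binary, not a variable from the suite):

    #!/usr/bin/env bash
    # Placeholder path; point this at build/examples/accel_perf in your tree.
    ACCEL_PERF=${ACCEL_PERF:-./build/examples/accel_perf}

    # Negative test: a compress workload without -l <input file> must fail.
    if "$ACCEL_PERF" -t 1 -w compress; then
        echo "FAIL: accel_perf accepted a compress workload with no input file"
        exit 1
    fi
    echo "PASS: missing filename was rejected as expected"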
00:09:08.768 15:45:29 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:09:08.768 15:45:29 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:08.768 15:45:29 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:09:08.768 15:45:29 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:09:08.768 15:45:29 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:09:08.768 15:45:29 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:08.768 00:09:08.768 real 0m0.294s 00:09:08.768 user 0m0.209s 00:09:08.768 sys 0m0.118s 00:09:08.768 15:45:29 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:08.768 15:45:29 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:09:08.768 ************************************ 00:09:08.768 END TEST accel_missing_filename 00:09:08.768 ************************************ 00:09:08.768 15:45:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:08.768 15:45:29 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:08.768 15:45:29 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:09:08.768 15:45:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:08.768 15:45:29 accel -- common/autotest_common.sh@10 -- # set +x 00:09:08.768 ************************************ 00:09:08.768 START TEST accel_compress_verify 00:09:08.768 ************************************ 00:09:08.768 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:08.768 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:09:08.768 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:08.768 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:08.768 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:08.768 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:08.768 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:08.768 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:08.768 15:45:29 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:08.768 15:45:29 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:09:08.768 15:45:29 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:08.768 15:45:29 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:08.768 15:45:29 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:08.768 15:45:29 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:08.768 15:45:29 
accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:08.768 15:45:29 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:09:08.768 15:45:29 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:09:08.768 [2024-07-12 15:45:29.158872] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:08.768 [2024-07-12 15:45:29.158991] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2467848 ] 00:09:09.029 [2024-07-12 15:45:29.304568] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.029 [2024-07-12 15:45:29.379603] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.029 [2024-07-12 15:45:29.422958] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:09.029 [2024-07-12 15:45:29.461106] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:09:09.290 00:09:09.290 Compression does not support the verify option, aborting. 00:09:09.290 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:09:09.290 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:09.290 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:09:09.290 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:09:09.290 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:09:09.290 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:09.290 00:09:09.290 real 0m0.398s 00:09:09.290 user 0m0.248s 00:09:09.290 sys 0m0.180s 00:09:09.290 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:09.290 15:45:29 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:09:09.290 ************************************ 00:09:09.290 END TEST accel_compress_verify 00:09:09.290 ************************************ 00:09:09.290 15:45:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:09.290 15:45:29 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:09:09.290 15:45:29 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:09.290 15:45:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:09.290 15:45:29 accel -- common/autotest_common.sh@10 -- # set +x 00:09:09.290 ************************************ 00:09:09.290 START TEST accel_wrong_workload 00:09:09.290 ************************************ 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
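Both negative tests end with the same exit-status bookkeeping visible in the trace: a raw status above 128 (234 and 161 here) means the process died on a signal, so 128 is subtracted, and whatever remains is collapsed to a plain failure code of 1. A compact sketch of that normalization, not the literal autotest_common.sh source:

    # Normalize an exit status the way the traced helper does:
    # >128 means "terminated by signal", so strip the offset, then
    # collapse any remaining non-zero status to a plain 1.
    normalize_es() {
        local es=$1
        if ((es > 128)); then
            es=$((es - 128))
        fi
        if ((es != 0)); then
            es=1
        fi
        echo "$es"
    }

    normalize_es 234   # prints 1 (234 - 128 = 106, then collapsed)
    normalize_es 161   # prints 1 (161 - 128 = 33, then collapsed)
    normalize_es 0     # prints 0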
00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:09:09.290 15:45:29 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:09:09.290 15:45:29 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:09:09.290 15:45:29 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:09.290 15:45:29 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:09.290 15:45:29 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:09.290 15:45:29 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:09.290 15:45:29 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:09.290 15:45:29 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:09:09.290 15:45:29 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:09:09.290 Unsupported workload type: foobar 00:09:09.290 [2024-07-12 15:45:29.620580] app.c:1459:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:09:09.290 accel_perf options: 00:09:09.290 [-h help message] 00:09:09.290 [-q queue depth per core] 00:09:09.290 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:09:09.290 [-T number of threads per core 00:09:09.290 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:09:09.290 [-t time in seconds] 00:09:09.290 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:09:09.290 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:09:09.290 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:09:09.290 [-l for compress/decompress workloads, name of uncompressed input file 00:09:09.290 [-S for crc32c workload, use this seed value (default 0) 00:09:09.290 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:09:09.290 [-f for fill workload, use this BYTE value (default 255) 00:09:09.290 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:09:09.290 [-y verify result if this switch is on] 00:09:09.290 [-a tasks to allocate per core (default: same value as -q)] 00:09:09.290 Can be used to spread operations across a wider range of memory. 
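For reference, the option summary printed above maps directly onto the invocations this suite issues further down in the log; the binary path is the one used throughout this run, and every flag appears in the usage text:

    PERF=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf

    # Invocations exercised later in this log (the accel_test wrapper
    # additionally prepends a JSON config via -c, e.g. -c /dev/fd/62).
    $PERF -t 1 -w crc32c -S 32 -y              # crc32c, seed 32, verify result
    $PERF -t 1 -w crc32c -y -C 2               # crc32c over a 2-element io vector
    $PERF -t 1 -w copy -y                      # plain copy with verification
    $PERF -t 1 -w fill -f 128 -q 64 -a 64 -y   # fill byte 128, queue depth 64, 64 tasks
    $PERF -t 1 -w copy_crc32c -y               # combined copy + crc32c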
00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:09.290 00:09:09.290 real 0m0.042s 00:09:09.290 user 0m0.024s 00:09:09.290 sys 0m0.018s 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:09.290 15:45:29 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:09:09.290 ************************************ 00:09:09.290 END TEST accel_wrong_workload 00:09:09.290 ************************************ 00:09:09.290 Error: writing output failed: Broken pipe 00:09:09.290 15:45:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:09.290 15:45:29 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:09:09.290 15:45:29 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:09:09.290 15:45:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:09.291 15:45:29 accel -- common/autotest_common.sh@10 -- # set +x 00:09:09.291 ************************************ 00:09:09.291 START TEST accel_negative_buffers 00:09:09.291 ************************************ 00:09:09.291 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:09:09.291 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0 00:09:09.291 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:09:09.291 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:09.291 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:09.291 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:09.291 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:09.291 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:09:09.291 15:45:29 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:09:09.291 15:45:29 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:09:09.291 15:45:29 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:09.291 15:45:29 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:09.291 15:45:29 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:09.291 15:45:29 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:09.291 15:45:29 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:09.291 15:45:29 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:09:09.291 15:45:29 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:09:09.552 -x option must be non-negative. 
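Each of these sub-tests is driven through the run_test helper from autotest_common.sh, which produces the asterisk banners, the START TEST/END TEST markers, and the real/user/sys timing lines seen throughout this log. A stripped-down imitation of that pattern (the real helper also manages xtrace and argument checks):

    # Minimal imitation of the run_test pattern visible in this log:
    # banner, timed execution, banner, and propagate the exit status.
    run_test_sketch() {
        local name=$1
        shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"
        local rc=$?
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
        return $rc
    }

    # Hypothetical usage: run_test_sketch accel_wrong_workload false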
00:09:09.552 [2024-07-12 15:45:29.746911] app.c:1459:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:09:09.552 accel_perf options: 00:09:09.552 [-h help message] 00:09:09.552 [-q queue depth per core] 00:09:09.552 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:09:09.552 [-T number of threads per core 00:09:09.552 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:09:09.552 [-t time in seconds] 00:09:09.552 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:09:09.552 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:09:09.552 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:09:09.552 [-l for compress/decompress workloads, name of uncompressed input file 00:09:09.552 [-S for crc32c workload, use this seed value (default 0) 00:09:09.552 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:09:09.552 [-f for fill workload, use this BYTE value (default 255) 00:09:09.552 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:09:09.552 [-y verify result if this switch is on] 00:09:09.552 [-a tasks to allocate per core (default: same value as -q)] 00:09:09.552 Can be used to spread operations across a wider range of memory. 00:09:09.552 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:09:09.552 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:09.552 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:09.552 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:09.552 00:09:09.552 real 0m0.048s 00:09:09.552 user 0m0.070s 00:09:09.552 sys 0m0.019s 00:09:09.552 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:09.552 15:45:29 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:09:09.552 ************************************ 00:09:09.552 END TEST accel_negative_buffers 00:09:09.552 ************************************ 00:09:09.552 15:45:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:09.552 15:45:29 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:09:09.552 15:45:29 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:09.552 15:45:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:09.552 15:45:29 accel -- common/autotest_common.sh@10 -- # set +x 00:09:09.552 ************************************ 00:09:09.552 START TEST accel_crc32c 00:09:09.552 ************************************ 00:09:09.552 15:45:29 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf 
-c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:09:09.552 15:45:29 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:09:09.552 [2024-07-12 15:45:29.865283] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:09.552 [2024-07-12 15:45:29.865342] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2468185 ] 00:09:09.552 [2024-07-12 15:45:29.952863] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.814 [2024-07-12 15:45:30.016503] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:09:09.814 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.815 15:45:30 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:10.756 15:45:31 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:10.756 15:45:31 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:10.757 15:45:31 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:10.757 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:10.757 15:45:31 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:10.757 15:45:31 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:10.757 15:45:31 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:09:10.757 15:45:31 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:10.757 00:09:10.757 real 0m1.326s 00:09:10.757 user 0m1.208s 00:09:10.757 sys 0m0.115s 00:09:10.757 15:45:31 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:10.757 15:45:31 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:09:10.757 ************************************ 00:09:10.757 END TEST accel_crc32c 00:09:10.757 ************************************ 00:09:10.757 15:45:31 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:10.757 15:45:31 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:09:10.757 15:45:31 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:10.757 15:45:31 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:10.757 15:45:31 accel -- common/autotest_common.sh@10 -- # set +x 00:09:11.017 ************************************ 00:09:11.017 START TEST accel_crc32c_C2 00:09:11.017 ************************************ 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w crc32c -y -C 2 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:09:11.017 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:09:11.017 [2024-07-12 15:45:31.267954] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:11.017 [2024-07-12 15:45:31.268016] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2468298 ] 00:09:11.017 [2024-07-12 15:45:31.354258] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:11.017 [2024-07-12 15:45:31.426278] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- 
accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:11.278 15:45:31 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:12.217 
15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:12.217 00:09:12.217 real 0m1.332s 00:09:12.217 user 0m1.198s 00:09:12.217 sys 0m0.133s 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:12.217 15:45:32 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:09:12.217 ************************************ 00:09:12.217 END TEST accel_crc32c_C2 00:09:12.217 ************************************ 00:09:12.217 15:45:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:12.217 15:45:32 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:09:12.217 15:45:32 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:12.217 15:45:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:12.217 15:45:32 accel -- common/autotest_common.sh@10 -- # set +x 00:09:12.217 ************************************ 00:09:12.217 START TEST accel_copy 00:09:12.217 ************************************ 00:09:12.217 15:45:32 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.217 15:45:32 accel.accel_copy -- 
accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:09:12.217 15:45:32 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:09:12.477 [2024-07-12 15:45:32.674541] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:12.477 [2024-07-12 15:45:32.674605] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2468536 ] 00:09:12.477 [2024-07-12 15:45:32.767444] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.477 [2024-07-12 15:45:32.841100] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.477 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 
15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.478 15:45:32 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 
-- # read -r var val 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:09:13.859 15:45:33 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:13.859 00:09:13.859 real 0m1.339s 00:09:13.859 user 0m1.206s 00:09:13.859 sys 0m0.133s 00:09:13.859 15:45:33 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:13.859 15:45:33 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:09:13.859 ************************************ 00:09:13.859 END TEST accel_copy 00:09:13.860 ************************************ 00:09:13.860 15:45:34 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:13.860 15:45:34 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:13.860 15:45:34 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:13.860 15:45:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:13.860 15:45:34 accel -- common/autotest_common.sh@10 -- # set +x 00:09:13.860 ************************************ 00:09:13.860 START TEST accel_fill 00:09:13.860 ************************************ 00:09:13.860 15:45:34 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:13.860 15:45:34 
accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:09:13.860 [2024-07-12 15:45:34.086671] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:13.860 [2024-07-12 15:45:34.086734] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2468860 ] 00:09:13.860 [2024-07-12 15:45:34.174416] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:13.860 [2024-07-12 15:45:34.239743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.860 15:45:34 accel.accel_fill -- 
accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.860 15:45:34 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 
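The dense blocks of "IFS=:", "read -r var val", and "case \"$var\" in" trace lines running through the crc32c, copy, fill, and copy_crc32c tests come from accel.sh parsing the colon-separated summary that accel_perf prints, pulling out which opcode ran and which module serviced it. A rough sketch of a loop with that shape; the field names below are illustrative stand-ins, not the exact accel_perf output:

    # Rough shape of the traced parsing loop: split "key: value" lines and
    # remember the opcode and the module that executed it.
    accel_opc=""
    accel_module=""
    while IFS=: read -r var val; do
        val=${val# }                      # drop the space after the colon
        case "$var" in
            *opcode*) accel_opc=$val ;;   # illustrative field name
            *module*) accel_module=$val ;;
        esac
    done < <(printf '%s\n' 'opcode: fill' 'module: software')

    echo "ran $accel_opc via $accel_module"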
00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:09:15.244 15:45:35 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:15.244 00:09:15.244 real 0m1.321s 00:09:15.244 user 0m1.205s 00:09:15.244 sys 0m0.118s 00:09:15.244 15:45:35 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:15.244 15:45:35 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:09:15.244 ************************************ 00:09:15.244 END TEST accel_fill 00:09:15.244 ************************************ 00:09:15.244 15:45:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:15.244 15:45:35 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:09:15.244 15:45:35 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:15.244 15:45:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:15.244 15:45:35 accel -- common/autotest_common.sh@10 -- # set +x 00:09:15.244 ************************************ 00:09:15.244 START TEST accel_copy_crc32c 00:09:15.244 ************************************ 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- 
accel/accel.sh@40 -- # local IFS=, 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:09:15.244 [2024-07-12 15:45:35.480018] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:15.244 [2024-07-12 15:45:35.480084] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2469177 ] 00:09:15.244 [2024-07-12 15:45:35.567364] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:15.244 [2024-07-12 15:45:35.630857] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.244 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val='4096 bytes' 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=software 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:15.245 15:45:35 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:16.631 00:09:16.631 real 0m1.321s 00:09:16.631 user 0m1.210s 00:09:16.631 sys 0m0.111s 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:16.631 15:45:36 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:09:16.631 ************************************ 00:09:16.631 END TEST accel_copy_crc32c 00:09:16.631 ************************************ 00:09:16.631 15:45:36 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:16.631 15:45:36 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:09:16.631 15:45:36 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:16.631 15:45:36 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.631 15:45:36 accel -- common/autotest_common.sh@10 -- # set +x 00:09:16.631 ************************************ 00:09:16.631 START TEST accel_copy_crc32c_C2 00:09:16.631 ************************************ 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 
-- accel/accel.sh@19 -- # read -r var val 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:09:16.631 15:45:36 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:09:16.631 [2024-07-12 15:45:36.876741] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:16.631 [2024-07-12 15:45:36.876803] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2469367 ] 00:09:16.631 [2024-07-12 15:45:36.963320] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:16.631 [2024-07-12 15:45:37.037070] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- 
accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.891 15:45:37 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:17.833 00:09:17.833 real 0m1.336s 00:09:17.833 user 0m1.209s 00:09:17.833 sys 0m0.124s 00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:09:17.833 15:45:38 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:09:17.833 ************************************ 00:09:17.833 END TEST accel_copy_crc32c_C2 00:09:17.833 ************************************ 00:09:17.833 15:45:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:17.833 15:45:38 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:09:17.833 15:45:38 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:17.833 15:45:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:17.833 15:45:38 accel -- common/autotest_common.sh@10 -- # set +x 00:09:17.833 ************************************ 00:09:17.833 START TEST accel_dualcast 00:09:17.833 ************************************ 00:09:17.833 15:45:38 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:09:17.833 15:45:38 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:09:18.094 [2024-07-12 15:45:38.287196] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
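The two copy_crc32c cases that just finished, and the dualcast case starting here, all drive the same accel_perf example binary; the exact flags are visible in the xtrace. A minimal way to repeat them by hand from this workspace; the harness passes its generated JSON accel config over -c /dev/fd/62, but with the empty config used in this job a bare invocation should behave the same (assumption):

    cd /var/jenkins/workspace/crypto-phy-autotest/spdk
    # one-second copy+crc32c with result verification (-y), as in the log
    ./build/examples/accel_perf -t 1 -w copy_crc32c -y
    # -C 2 splits the source across two vectors, which is presumably why the
    # xtrace above records an 8192-byte source next to the 4096-byte buffers
    ./build/examples/accel_perf -t 1 -w copy_crc32c -y -C 2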
00:09:18.094 [2024-07-12 15:45:38.287254] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2469553 ] 00:09:18.094 [2024-07-12 15:45:38.372699] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:18.094 [2024-07-12 15:45:38.447299] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- 
accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.094 15:45:38 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var 
val 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:09:19.478 15:45:39 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:19.478 00:09:19.478 real 0m1.336s 00:09:19.478 user 0m1.208s 00:09:19.478 sys 0m0.124s 00:09:19.478 15:45:39 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:19.478 15:45:39 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:09:19.478 ************************************ 00:09:19.478 END TEST accel_dualcast 00:09:19.478 ************************************ 00:09:19.478 15:45:39 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:19.478 15:45:39 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:09:19.478 15:45:39 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:19.478 15:45:39 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.478 15:45:39 accel -- common/autotest_common.sh@10 -- # set +x 00:09:19.478 ************************************ 00:09:19.478 START TEST accel_compare 00:09:19.478 ************************************ 00:09:19.478 15:45:39 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:09:19.478 15:45:39 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:09:19.478 15:45:39 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:09:19.478 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.478 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.478 15:45:39 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:09:19.479 [2024-07-12 15:45:39.699217] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
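Each finished case above is bracketed by run_test's START/END TEST banners and a real/user/sys timing block, roughly 1.3 s of wall clock apiece, consistent with the one-second (-t 1) accel_perf runs plus startup. When skimming a console log like this, a one-liner along these lines pulls out just that summary (the log filename is illustrative):

    grep -Eo '(real[[:space:]]+[0-9]+m[0-9.]+s|(START|END) TEST [A-Za-z0-9_]+)' console.log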
00:09:19.479 [2024-07-12 15:45:39.699275] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2469853 ] 00:09:19.479 [2024-07-12 15:45:39.797867] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:19.479 [2024-07-12 15:45:39.862499] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 
00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.479 15:45:39 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 
00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:09:20.859 15:45:40 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:20.859 00:09:20.859 real 0m1.330s 00:09:20.859 user 0m1.205s 00:09:20.859 sys 0m0.127s 00:09:20.859 15:45:40 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:20.859 15:45:40 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:09:20.859 ************************************ 00:09:20.859 END TEST accel_compare 00:09:20.859 ************************************ 00:09:20.859 15:45:41 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:20.859 15:45:41 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:09:20.859 15:45:41 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:20.859 15:45:41 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:20.859 15:45:41 accel -- common/autotest_common.sh@10 -- # set +x 00:09:20.859 ************************************ 00:09:20.859 START TEST accel_xor 00:09:20.859 ************************************ 00:09:20.859 15:45:41 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:09:20.859 [2024-07-12 15:45:41.102301] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
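The xor case starting here runs with two source buffers (the val=2 recorded just below), and the follow-up case repeats the workload with -x 3 to use three sources. Side by side, the two invocations from the log, again dropping the -c /dev/fd/62 config as noted earlier:

    # two-source xor, one second, verified (-y)
    ./build/examples/accel_perf -t 1 -w xor -y
    # same workload with three source buffers
    ./build/examples/accel_perf -t 1 -w xor -y -x 3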
00:09:20.859 [2024-07-12 15:45:41.102350] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2470174 ] 00:09:20.859 [2024-07-12 15:45:41.189556] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:20.859 [2024-07-12 15:45:41.256402] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:20.859 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:09:21.119 15:45:41 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.119 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.120 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.120 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.120 15:45:41 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.120 15:45:41 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.120 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.120 15:45:41 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:22.060 00:09:22.060 real 0m1.324s 00:09:22.060 user 0m1.211s 00:09:22.060 sys 0m0.114s 00:09:22.060 15:45:42 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:22.060 15:45:42 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:09:22.060 ************************************ 00:09:22.060 END TEST accel_xor 00:09:22.060 ************************************ 00:09:22.060 15:45:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:22.060 15:45:42 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:09:22.060 15:45:42 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:22.060 15:45:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:22.060 15:45:42 accel -- common/autotest_common.sh@10 -- # set +x 00:09:22.060 ************************************ 00:09:22.060 START TEST accel_xor 00:09:22.060 ************************************ 00:09:22.060 15:45:42 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:09:22.060 15:45:42 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:09:22.060 [2024-07-12 15:45:42.502099] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:09:22.060 [2024-07-12 15:45:42.502161] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2470462 ] 00:09:22.321 [2024-07-12 15:45:42.590018] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:22.321 [2024-07-12 15:45:42.667587] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:09:22.321 15:45:42 
accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.321 15:45:42 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.322 15:45:42 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.322 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.322 15:45:42 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@21 
-- # case "$var" in 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:09:23.704 15:45:43 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:23.704 00:09:23.704 real 0m1.336s 00:09:23.704 user 0m1.200s 00:09:23.704 sys 0m0.137s 00:09:23.704 15:45:43 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:23.704 15:45:43 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:09:23.704 ************************************ 00:09:23.704 END TEST accel_xor 00:09:23.704 ************************************ 00:09:23.704 15:45:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:23.704 15:45:43 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:09:23.705 15:45:43 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:23.705 15:45:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:23.705 15:45:43 accel -- common/autotest_common.sh@10 -- # set +x 00:09:23.705 ************************************ 00:09:23.705 START TEST accel_dif_verify 00:09:23.705 ************************************ 00:09:23.705 15:45:43 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:09:23.705 15:45:43 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:09:23.705 [2024-07-12 15:45:43.910450] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
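(Side note on the timings.) Both xor passes report their cost inline (`real 0m1.324s` and `real 0m1.336s` above). If you want to compare workloads across a captured log like this one, an illustrative filter is enough; `build.log` is a placeholder for wherever this console output was saved.

```bash
# Illustrative only: pull the per-test timing lines out of a saved console log.
grep -E 'real[[:space:]]+[0-9]+m[0-9.]+s' build.log

# Or pair each timing with the test banner that precedes it:
awk '/START TEST/ {test=$NF} /real[[:space:]]+[0-9]+m/ {print test, $0}' build.log
```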
00:09:23.705 [2024-07-12 15:45:43.910509] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2470569 ] 00:09:23.705 [2024-07-12 15:45:43.999391] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:23.705 [2024-07-12 15:45:44.077840] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:23.705 15:45:44 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.088 15:45:45 accel.accel_dif_verify -- 
accel/accel.sh@21 -- # case "$var" in 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:09:25.088 15:45:45 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:25.088 00:09:25.088 real 0m1.345s 00:09:25.088 user 0m1.217s 00:09:25.088 sys 0m0.124s 00:09:25.088 15:45:45 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:25.088 15:45:45 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:09:25.088 ************************************ 00:09:25.088 END TEST accel_dif_verify 00:09:25.088 ************************************ 00:09:25.088 15:45:45 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:25.088 15:45:45 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:09:25.088 15:45:45 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:25.088 15:45:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:25.088 15:45:45 accel -- common/autotest_common.sh@10 -- # set +x 00:09:25.088 ************************************ 00:09:25.088 START TEST accel_dif_generate 00:09:25.088 ************************************ 00:09:25.088 15:45:45 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 
-- # read -r var val 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:09:25.088 [2024-07-12 15:45:45.329911] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:25.088 [2024-07-12 15:45:45.329971] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2470852 ] 00:09:25.088 [2024-07-12 15:45:45.427597] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:25.088 [2024-07-12 15:45:45.492724] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.088 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@23 -- 
# accel_opc=dif_generate 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.348 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.349 15:45:45 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:09:26.290 15:45:46 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:26.290 00:09:26.290 real 0m1.330s 00:09:26.290 user 0m1.204s 00:09:26.290 sys 0m0.130s 00:09:26.290 
15:45:46 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:26.290 15:45:46 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:09:26.290 ************************************ 00:09:26.290 END TEST accel_dif_generate 00:09:26.290 ************************************ 00:09:26.290 15:45:46 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:26.290 15:45:46 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:09:26.290 15:45:46 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:26.290 15:45:46 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:26.290 15:45:46 accel -- common/autotest_common.sh@10 -- # set +x 00:09:26.290 ************************************ 00:09:26.290 START TEST accel_dif_generate_copy 00:09:26.290 ************************************ 00:09:26.290 15:45:46 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:09:26.290 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:09:26.290 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:09:26.291 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:09:26.291 [2024-07-12 15:45:46.736851] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
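(Side note on the DIF passes.) dif_verify, dif_generate and the dif_generate_copy run now starting all go through the same accel_perf entry point with only the `-w` value changing; the trace echoes 4096-byte, 512-byte and 8-byte buffer values for these cases rather than setting them on the command line. A hedged sketch of cycling through the trio, again with an assumed config file in place of `/dev/fd/62`:

```bash
# Sketch: run the three DIF workloads back to back, mirroring the -w values
# echoed in the trace. ACCEL_CFG is an assumption; the harness uses /dev/fd/62.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
ACCEL_CFG=./accel.json

for wl in dif_verify dif_generate dif_generate_copy; do
    # The 4096/512/8-byte values reported above are harness defaults;
    # no extra flags are passed for them here.
    "$SPDK_DIR/build/examples/accel_perf" -c "$ACCEL_CFG" -t 1 -w "$wl"
done
```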
00:09:26.291 [2024-07-12 15:45:46.736959] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2471167 ] 00:09:26.551 [2024-07-12 15:45:46.833191] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:26.551 [2024-07-12 15:45:46.909516] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- 
accel/accel.sh@20 -- # val= 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:26.551 15:45:46 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.931 15:45:48 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:27.931 00:09:27.931 real 0m1.345s 00:09:27.931 user 0m1.214s 00:09:27.931 sys 0m0.134s 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:27.931 15:45:48 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:09:27.931 ************************************ 00:09:27.932 END TEST accel_dif_generate_copy 00:09:27.932 ************************************ 00:09:27.932 15:45:48 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:27.932 15:45:48 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:09:27.932 15:45:48 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:27.932 15:45:48 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:27.932 15:45:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:27.932 15:45:48 accel -- common/autotest_common.sh@10 -- # set +x 00:09:27.932 ************************************ 00:09:27.932 START TEST accel_comp 00:09:27.932 ************************************ 00:09:27.932 15:45:48 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:09:27.932 
15:45:48 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:09:27.932 [2024-07-12 15:45:48.150898] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:27.932 [2024-07-12 15:45:48.150967] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2471456 ] 00:09:27.932 [2024-07-12 15:45:48.239392] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:27.932 [2024-07-12 15:45:48.308383] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 
15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:27.932 15:45:48 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:27.932 15:45:48 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:09:29.311 15:45:49 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:29.311 00:09:29.311 real 0m1.325s 00:09:29.311 user 0m1.212s 00:09:29.311 sys 0m0.120s 00:09:29.311 15:45:49 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:29.311 15:45:49 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:09:29.311 ************************************ 00:09:29.311 END TEST accel_comp 00:09:29.311 ************************************ 00:09:29.311 15:45:49 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:29.311 15:45:49 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:29.311 15:45:49 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:29.312 15:45:49 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:29.312 15:45:49 accel -- common/autotest_common.sh@10 -- # set +x 00:09:29.312 ************************************ 00:09:29.312 START TEST accel_decomp 
00:09:29.312 ************************************ 00:09:29.312 15:45:49 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:09:29.312 15:45:49 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:09:29.312 [2024-07-12 15:45:49.553058] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:29.312 [2024-07-12 15:45:49.553116] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2471569 ] 00:09:29.312 [2024-07-12 15:45:49.641347] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:29.312 [2024-07-12 15:45:49.718091] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.572 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 
15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:29.573 15:45:49 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:30.514 15:45:50 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:30.514 00:09:30.514 real 0m1.347s 00:09:30.514 user 0m1.215s 00:09:30.514 sys 0m0.129s 00:09:30.514 15:45:50 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:30.514 15:45:50 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:09:30.514 ************************************ 00:09:30.514 END TEST accel_decomp 00:09:30.514 ************************************ 
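For reference, the accel_decomp case that just finished reduces to the single accel_perf invocation echoed in its trace: a one-second software decompress of the bundled test/accel/bib payload with result verification. The sketch below restates that command outside the accel.sh harness; all paths and flags are taken from the trace above, while the use of /dev/fd/62 as the channel for the generated JSON config (built by build_accel_config and filtered through jq -r .) is an assumption about the harness, so treat this as illustrative rather than a drop-in command.

# Sketch only: the command exercised by TEST accel_decomp, lifted out of accel.sh.
# -t 1 runs the workload for one second, -w decompress selects the workload,
# -l points at the compressed bib input, and -y appears to ask accel_perf to
# verify the decompressed output. The config on /dev/fd/62 is normally supplied
# by the harness, not typed by hand.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w decompress \
    -l "$SPDK/test/accel/bib" -y

The real 0m1.347s / user 0m1.215s figures reported above are consistent with that shape of run: roughly one second of single-core decompress work plus harness overhead.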
00:09:30.514 15:45:50 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:30.514 15:45:50 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:30.514 15:45:50 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:30.514 15:45:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:30.514 15:45:50 accel -- common/autotest_common.sh@10 -- # set +x 00:09:30.514 ************************************ 00:09:30.514 START TEST accel_decomp_full 00:09:30.514 ************************************ 00:09:30.514 15:45:50 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:30.514 15:45:50 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:09:30.514 15:45:50 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:09:30.514 15:45:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.514 15:45:50 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:09:30.515 15:45:50 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:09:30.774 [2024-07-12 15:45:50.975736] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:09:30.774 [2024-07-12 15:45:50.975794] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2471848 ] 00:09:30.774 [2024-07-12 15:45:51.066884] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.774 [2024-07-12 15:45:51.143053] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.774 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 
accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:30.775 15:45:51 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 
00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:32.158 15:45:52 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:32.158 00:09:32.158 real 0m1.358s 00:09:32.158 user 0m1.231s 00:09:32.158 sys 0m0.123s 00:09:32.158 15:45:52 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:32.158 15:45:52 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:09:32.158 ************************************ 00:09:32.158 END TEST accel_decomp_full 00:09:32.158 ************************************ 00:09:32.158 15:45:52 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:32.158 15:45:52 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:32.158 15:45:52 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:32.158 15:45:52 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:32.158 15:45:52 accel -- common/autotest_common.sh@10 -- # set +x 00:09:32.158 ************************************ 00:09:32.158 START TEST accel_decomp_mcore 00:09:32.158 ************************************ 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 
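The remaining cases in this block vary only the accel_perf flags: the _full variants add -o 0, which appears to switch the transfer size from the default 4096-byte chunk to the full 111250-byte payload (both sizes show up in the traces); the _mcore variants add -m 0xf, a four-core mask that matches the "Total cores available: 4" notices and the four reactors started below; and the _mthread variants add -T 2, presumably a second worker thread given the naming and the val=2 in their traces. As a hedged sketch, the multicore run that starts next is roughly:

# Sketch only: same decompress workload, restricted to a 0xf (four-core) mask.
# The larger user time reported for this case (~4.5 s against ~1.3 s of wall
# time) reflects four cores running the one-second workload in parallel.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w decompress \
    -l "$SPDK/test/accel/bib" -y -m 0xf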
00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:32.158 [2024-07-12 15:45:52.405492] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:32.158 [2024-07-12 15:45:52.405557] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2472168 ] 00:09:32.158 [2024-07-12 15:45:52.491403] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:32.158 [2024-07-12 15:45:52.557904] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:32.158 [2024-07-12 15:45:52.558049] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:32.158 [2024-07-12 15:45:52.558191] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:32.158 [2024-07-12 15:45:52.558191] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.158 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:32.418 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- 
accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- 
accel/accel.sh@20 -- # val='1 seconds' 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:32.419 15:45:52 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.371 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:33.371 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.371 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.371 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.371 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:33.371 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.371 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.371 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.371 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 
00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:33.372 00:09:33.372 real 0m1.338s 00:09:33.372 user 0m4.492s 00:09:33.372 sys 0m0.130s 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:33.372 15:45:53 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:33.372 ************************************ 00:09:33.372 END TEST accel_decomp_mcore 00:09:33.372 ************************************ 00:09:33.372 15:45:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:33.372 15:45:53 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:33.372 15:45:53 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:33.372 15:45:53 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:33.372 15:45:53 accel -- common/autotest_common.sh@10 -- # set +x 00:09:33.372 ************************************ 00:09:33.372 START TEST accel_decomp_full_mcore 00:09:33.372 ************************************ 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:33.372 
15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:33.372 15:45:53 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:33.372 [2024-07-12 15:45:53.818035] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:33.372 [2024-07-12 15:45:53.818151] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2472488 ] 00:09:33.632 [2024-07-12 15:45:53.900202] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:33.632 [2024-07-12 15:45:53.964756] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:33.632 [2024-07-12 15:45:53.964902] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:33.632 [2024-07-12 15:45:53.965047] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:33.632 [2024-07-12 15:45:53.965047] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:33.632 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:33.632 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.632 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.632 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.632 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:33.632 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.632 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.632 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.632 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:33.633 15:45:54 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:33.633 15:45:54 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:35.016 00:09:35.016 real 0m1.356s 00:09:35.016 user 0m4.586s 00:09:35.016 sys 0m0.118s 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:35.016 15:45:55 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:35.016 ************************************ 00:09:35.016 END TEST accel_decomp_full_mcore 00:09:35.016 ************************************ 00:09:35.016 15:45:55 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:35.016 15:45:55 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:35.016 15:45:55 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:35.016 15:45:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:35.016 15:45:55 accel -- common/autotest_common.sh@10 -- # set +x 00:09:35.016 ************************************ 00:09:35.016 START TEST accel_decomp_mthread 00:09:35.016 ************************************ 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:35.016 
15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:35.016 [2024-07-12 15:45:55.242992] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:35.016 [2024-07-12 15:45:55.243042] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2472690 ] 00:09:35.016 [2024-07-12 15:45:55.310369] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:35.016 [2024-07-12 15:45:55.372284] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:35.016 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:35.017 15:45:55 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:35.017 15:45:55 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:36.397 00:09:36.397 real 0m1.307s 00:09:36.397 user 0m1.206s 00:09:36.397 sys 0m0.100s 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:36.397 15:45:56 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:36.397 ************************************ 00:09:36.397 END TEST accel_decomp_mthread 00:09:36.397 ************************************ 00:09:36.397 15:45:56 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:36.397 15:45:56 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:36.397 15:45:56 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 
00:09:36.397 15:45:56 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:36.397 15:45:56 accel -- common/autotest_common.sh@10 -- # set +x 00:09:36.397 ************************************ 00:09:36.397 START TEST accel_decomp_full_mthread 00:09:36.398 ************************************ 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:36.398 [2024-07-12 15:45:56.625706] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:09:36.398 [2024-07-12 15:45:56.625778] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2472863 ] 00:09:36.398 [2024-07-12 15:45:56.714278] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:36.398 [2024-07-12 15:45:56.787502] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.398 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 
15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:36.658 15:45:56 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:36.658 15:45:56 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:37.596 00:09:37.596 real 0m1.371s 00:09:37.596 user 0m1.238s 00:09:37.596 sys 0m0.128s 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:37.596 15:45:57 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:37.596 ************************************ 00:09:37.596 END TEST accel_decomp_full_mthread 00:09:37.596 ************************************ 
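The decompress tests above and the accel_cdev_* tests that follow drive the same accel_perf example binary; what changes is the accel module behind it. A minimal sketch of the two invocations, copied from the trace itself (the flag descriptions are inferred from the values accel.sh echoes above — '1 seconds', 'Yes', '2' — and should be read as assumptions, not SPDK documentation):

  # Software module path (run_test accel_decomp_full_mthread): roughly a 1-second
  # decompress workload over test/accel/bib with verification and two threads,
  # accel JSON config piped in on /dev/fd/62.
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf \
      -c /dev/fd/62 -t 1 -w decompress \
      -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2

  # compressdev path (accel_cdev_* tests below): build_accel_config appends the
  # fragment below to the JSON config, and spdk_tgt is started the same way for
  # the RPC opcode checks, switching the expected module to dpdk_compressdev:
  #   {"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}
  /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63

Once that fragment is in place, the trace below confirms the switch by reporting "initialized QAT PMD" and "Channel ... PMD being used: compress_qat" before each accel_cdev_* run.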
00:09:37.596 15:45:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:37.596 15:45:57 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:09:37.596 15:45:57 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:09:37.596 15:45:57 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:09:37.596 15:45:57 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:37.596 15:45:57 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2473163 00:09:37.596 15:45:57 accel -- accel/accel.sh@63 -- # waitforlisten 2473163 00:09:37.596 15:45:57 accel -- common/autotest_common.sh@829 -- # '[' -z 2473163 ']' 00:09:37.596 15:45:57 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:37.596 15:45:57 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:37.596 15:45:57 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:37.597 15:45:57 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:09:37.597 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:37.597 15:45:57 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:37.597 15:45:57 accel -- accel/accel.sh@61 -- # build_accel_config 00:09:37.597 15:45:57 accel -- common/autotest_common.sh@10 -- # set +x 00:09:37.597 15:45:57 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:37.597 15:45:57 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:37.597 15:45:57 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:37.597 15:45:57 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:37.597 15:45:57 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:37.597 15:45:57 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:37.597 15:45:57 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:37.597 15:45:57 accel -- accel/accel.sh@41 -- # jq -r . 00:09:37.856 [2024-07-12 15:45:58.058330] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:09:37.856 [2024-07-12 15:45:58.058384] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2473163 ] 00:09:37.856 [2024-07-12 15:45:58.146111] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:37.856 [2024-07-12 15:45:58.222149] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:38.426 [2024-07-12 15:45:58.628011] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:38.686 15:45:58 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:38.686 15:45:58 accel -- common/autotest_common.sh@862 -- # return 0 00:09:38.686 15:45:58 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:09:38.686 15:45:58 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:09:38.686 15:45:58 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:09:38.686 15:45:58 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:09:38.686 15:45:58 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:09:38.686 15:45:58 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:09:38.686 15:45:58 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:38.686 15:45:58 accel -- common/autotest_common.sh@10 -- # set +x 00:09:38.686 15:45:58 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:09:38.686 15:45:58 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:09:38.686 15:45:59 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:38.686 "method": "compressdev_scan_accel_module", 00:09:38.686 15:45:59 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:09:38.686 15:45:59 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:09:38.686 15:45:59 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:09:38.686 15:45:59 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:38.686 15:45:59 accel -- common/autotest_common.sh@10 -- # set +x 00:09:38.686 15:45:59 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 
00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # IFS== 00:09:38.686 15:45:59 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:38.686 15:45:59 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:38.686 15:45:59 accel -- accel/accel.sh@75 -- # killprocess 2473163 00:09:38.686 15:45:59 accel -- common/autotest_common.sh@948 -- # '[' -z 2473163 ']' 00:09:38.686 15:45:59 accel -- common/autotest_common.sh@952 -- # kill -0 2473163 00:09:38.686 15:45:59 accel -- common/autotest_common.sh@953 -- # uname 00:09:38.686 15:45:59 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:38.686 15:45:59 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2473163 00:09:38.946 15:45:59 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:38.946 15:45:59 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:38.946 15:45:59 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2473163' 00:09:38.946 killing process with pid 2473163 00:09:38.946 15:45:59 accel -- common/autotest_common.sh@967 -- # kill 2473163 00:09:38.946 15:45:59 accel -- common/autotest_common.sh@972 -- # wait 2473163 00:09:38.946 15:45:59 accel -- accel/accel.sh@76 -- # trap - ERR 00:09:38.946 15:45:59 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:38.946 15:45:59 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:38.946 15:45:59 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:38.946 15:45:59 accel -- common/autotest_common.sh@10 -- # set +x 00:09:38.946 ************************************ 00:09:38.946 START TEST accel_cdev_comp 00:09:38.946 ************************************ 00:09:38.946 15:45:59 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:09:38.946 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:09:39.206 [2024-07-12 15:45:59.419070] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:39.206 [2024-07-12 15:45:59.419140] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2473481 ] 00:09:39.206 [2024-07-12 15:45:59.504627] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:39.206 [2024-07-12 15:45:59.568976] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:39.775 [2024-07-12 15:45:59.968609] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:39.775 [2024-07-12 15:45:59.970354] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1147830 PMD being used: compress_qat 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 [2024-07-12 15:45:59.973398] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x134c650 PMD being used: compress_qat 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:39.775 
15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:39.775 15:45:59 accel.accel_cdev_comp 
-- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:39.775 15:45:59 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:09:40.777 15:46:01 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:40.777 00:09:40.777 real 0m1.689s 00:09:40.777 user 0m1.403s 00:09:40.777 sys 0m0.287s 00:09:40.777 15:46:01 accel.accel_cdev_comp 
-- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:40.777 15:46:01 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:09:40.777 ************************************ 00:09:40.777 END TEST accel_cdev_comp 00:09:40.777 ************************************ 00:09:40.777 15:46:01 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:40.777 15:46:01 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:40.777 15:46:01 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:40.777 15:46:01 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:40.777 15:46:01 accel -- common/autotest_common.sh@10 -- # set +x 00:09:40.777 ************************************ 00:09:40.777 START TEST accel_cdev_decomp 00:09:40.777 ************************************ 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:09:40.777 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:09:40.777 [2024-07-12 15:46:01.182907] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:09:40.778 [2024-07-12 15:46:01.182957] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2473808 ] 00:09:41.038 [2024-07-12 15:46:01.268458] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:41.038 [2024-07-12 15:46:01.333141] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:41.298 [2024-07-12 15:46:01.743597] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:41.299 [2024-07-12 15:46:01.745362] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xe80830 PMD being used: compress_qat 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.558 [2024-07-12 15:46:01.748531] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1085650 PMD being used: compress_qat 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.558 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 
00:09:41.559 15:46:01 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:42.501 00:09:42.501 real 0m1.701s 00:09:42.501 user 0m1.399s 00:09:42.501 sys 0m0.304s 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:42.501 15:46:02 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:09:42.501 ************************************ 00:09:42.501 END TEST accel_cdev_decomp 00:09:42.501 ************************************ 00:09:42.501 15:46:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:42.501 15:46:02 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:42.501 15:46:02 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:42.501 15:46:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:42.501 15:46:02 accel -- common/autotest_common.sh@10 -- # set +x 00:09:42.501 ************************************ 00:09:42.501 START TEST accel_cdev_decomp_full 00:09:42.501 ************************************ 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:09:42.501 15:46:02 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:09:42.763 [2024-07-12 15:46:02.957533] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:09:42.763 [2024-07-12 15:46:02.957603] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2474135 ] 00:09:42.763 [2024-07-12 15:46:03.046455] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:42.763 [2024-07-12 15:46:03.123659] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:43.334 [2024-07-12 15:46:03.521893] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:43.334 [2024-07-12 15:46:03.523658] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12f9830 PMD being used: compress_qat 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 [2024-07-12 15:46:03.525946] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x12fcbe0 PMD being used: compress_qat 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 
-- # val='111250 bytes' 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:43.334 15:46:03 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:44.275 00:09:44.275 real 0m1.704s 00:09:44.275 user 0m1.404s 00:09:44.275 sys 0m0.300s 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:44.275 15:46:04 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:09:44.275 ************************************ 00:09:44.275 END TEST accel_cdev_decomp_full 00:09:44.275 ************************************ 00:09:44.275 15:46:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:44.275 15:46:04 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:44.275 15:46:04 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:44.275 15:46:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:44.275 15:46:04 accel -- common/autotest_common.sh@10 -- # set +x 00:09:44.275 ************************************ 00:09:44.275 START TEST accel_cdev_decomp_mcore 00:09:44.275 ************************************ 00:09:44.275 15:46:04 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:44.275 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:44.275 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:44.275 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:44.275 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:44.275 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:44.275 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:44.275 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:44.276 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:44.276 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:44.276 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:44.276 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:44.276 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:44.276 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:44.276 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:44.276 15:46:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:44.536 [2024-07-12 15:46:04.734860] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
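For reference, the accel_cdev_decomp_mcore case that starts here reduces to the accel_perf command logged just above. The sketch below restates it as a standalone invocation: the flags, paths and 0xf core mask are copied from the log, while the JSON wrapper around the logged compressdev_scan_accel_module entry is an assumption based on SPDK's usual subsystem-config layout (the test script itself builds it via build_accel_config and feeds it in on /dev/fd/62).

    # Sketch only; the subsystem wrapper is assumed, everything else is taken from the log.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    accel_cfg='{"subsystems": [{"subsystem": "accel", "config": [{"method": "compressdev_scan_accel_module", "params": {"pmd": 0}}]}]}'
    "$SPDK/build/examples/accel_perf" -c <(printf '%s\n' "$accel_cfg") \
        -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -m 0xf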
00:09:44.536 [2024-07-12 15:46:04.734913] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2474460 ] 00:09:44.536 [2024-07-12 15:46:04.821123] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:44.536 [2024-07-12 15:46:04.887182] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:44.536 [2024-07-12 15:46:04.887304] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:44.536 [2024-07-12 15:46:04.887420] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:44.536 [2024-07-12 15:46:04.887420] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:45.110 [2024-07-12 15:46:05.286862] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:45.110 [2024-07-12 15:46:05.288611] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16dde00 PMD being used: compress_qat 00:09:45.110 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:45.110 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.110 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.110 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 [2024-07-12 15:46:05.292960] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1a3019b8b0 PMD being used: compress_qat 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:45.111 [2024-07-12 15:46:05.294125] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16e3140 PMD being used: compress_qat 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 [2024-07-12 15:46:05.299648] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1a2819b8b0 PMD being used: compress_qat 00:09:45.111 [2024-07-12 15:46:05.299806] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f1a2019b8b0 PMD being used: compress_qat 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:45.111 15:46:05 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:45.111 15:46:05 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:46.050 15:46:06 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.050 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:46.051 00:09:46.051 real 0m1.711s 00:09:46.051 user 0m5.792s 00:09:46.051 sys 0m0.304s 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:46.051 15:46:06 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:46.051 ************************************ 00:09:46.051 END TEST accel_cdev_decomp_mcore 00:09:46.051 ************************************ 00:09:46.051 15:46:06 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:46.051 15:46:06 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:46.051 15:46:06 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:46.051 15:46:06 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:46.051 15:46:06 accel -- common/autotest_common.sh@10 -- # set +x 00:09:46.051 ************************************ 00:09:46.051 START TEST accel_cdev_decomp_full_mcore 00:09:46.051 ************************************ 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # 
build_accel_config 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:46.051 15:46:06 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:46.311 [2024-07-12 15:46:06.524620] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:46.311 [2024-07-12 15:46:06.524744] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2474783 ] 00:09:46.311 [2024-07-12 15:46:06.620418] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:46.311 [2024-07-12 15:46:06.697530] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:46.311 [2024-07-12 15:46:06.697679] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:46.311 [2024-07-12 15:46:06.697822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:46.311 [2024-07-12 15:46:06.697822] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:46.881 [2024-07-12 15:46:07.103126] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:46.881 [2024-07-12 15:46:07.104888] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25ace00 PMD being used: compress_qat 00:09:46.881 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:46.881 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.881 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.881 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.881 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 [2024-07-12 15:46:07.108393] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc0bc19b8b0 PMD being used: compress_qat 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:46.882 [2024-07-12 15:46:07.109639] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25b2320 PMD being 
used: compress_qat 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 [2024-07-12 15:46:07.115273] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc0b419b8b0 PMD being used: compress_qat 00:09:46.882 [2024-07-12 15:46:07.115410] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc0ac19b8b0 PMD being used: compress_qat 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:46.882 15:46:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 
00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:47.822 00:09:47.822 real 0m1.748s 00:09:47.822 user 0m5.862s 00:09:47.822 sys 0m0.311s 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:47.822 15:46:08 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:47.822 ************************************ 00:09:47.822 END TEST accel_cdev_decomp_full_mcore 00:09:47.822 ************************************ 00:09:48.083 15:46:08 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:48.083 15:46:08 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:48.083 15:46:08 accel -- 
common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:48.083 15:46:08 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:48.083 15:46:08 accel -- common/autotest_common.sh@10 -- # set +x 00:09:48.083 ************************************ 00:09:48.083 START TEST accel_cdev_decomp_mthread 00:09:48.083 ************************************ 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:48.083 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:48.083 [2024-07-12 15:46:08.346250] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
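The mthread variant launched here differs from the earlier single-core run only in the trailing -T 2 option, and it keeps the default 0x1 core mask instead of -m 0xf; judging from the matching "val=2" entry in the dump that follows, the test treats -T as a worker-thread count. A minimal sketch, under the same assumptions as the earlier one:

    # accel_cfg: same JSON as in the first sketch; only -T 2 is new versus that run.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK/build/examples/accel_perf" -c <(printf '%s\n' "$accel_cfg") \
        -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -T 2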
00:09:48.083 [2024-07-12 15:46:08.346316] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2475113 ] 00:09:48.083 [2024-07-12 15:46:08.434897] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:48.083 [2024-07-12 15:46:08.510615] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:48.653 [2024-07-12 15:46:08.910858] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:48.653 [2024-07-12 15:46:08.912624] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10a8830 PMD being used: compress_qat 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 [2024-07-12 15:46:08.915977] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x10adaa0 PMD being used: compress_qat 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 [2024-07-12 15:46:08.917698] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x11d04c0 PMD being used: compress_qat 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:48.653 15:46:08 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 
15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:48.653 15:46:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 
00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:49.594 00:09:49.594 real 0m1.708s 00:09:49.594 user 0m1.422s 00:09:49.594 sys 0m0.288s 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:49.594 15:46:10 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:49.594 ************************************ 00:09:49.594 END TEST accel_cdev_decomp_mthread 00:09:49.594 ************************************ 00:09:49.855 15:46:10 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:49.855 15:46:10 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:49.855 15:46:10 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:49.855 15:46:10 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:49.855 15:46:10 accel -- common/autotest_common.sh@10 -- # set +x 00:09:49.855 ************************************ 00:09:49.855 START TEST accel_cdev_decomp_full_mthread 00:09:49.855 ************************************ 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:49.855 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:49.855 [2024-07-12 15:46:10.130754] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
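The full_mthread case that starts next combines the two earlier variations: -o 0 and -T 2. Comparing the value dumps, the "full" tests report a '111250 bytes' transfer while the others report '4096 bytes', which suggests -o 0 lets the input file's size drive the transfer size rather than a fixed 4 KiB block. A sketch under the same assumptions as above:

    # accel_cfg: same JSON as in the first sketch; -o 0 and -T 2 are taken from the logged command.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    "$SPDK/build/examples/accel_perf" -c <(printf '%s\n' "$accel_cfg") \
        -t 1 -w decompress -l "$SPDK/test/accel/bib" -y -o 0 -T 2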
00:09:49.855 [2024-07-12 15:46:10.130815] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2475438 ] 00:09:49.855 [2024-07-12 15:46:10.218317] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:49.855 [2024-07-12 15:46:10.282675] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:50.425 [2024-07-12 15:46:10.678184] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:50.425 [2024-07-12 15:46:10.679914] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2520830 PMD being used: compress_qat 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 [2024-07-12 15:46:10.682438] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25208d0 PMD being used: compress_qat 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:50.425 [2024-07-12 15:46:10.684330] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x27254c0 PMD being used: compress_qat 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:50.425 15:46:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:51.365 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@20 -- # val= 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:51.366 00:09:51.366 real 0m1.692s 00:09:51.366 user 0m1.402s 00:09:51.366 sys 0m0.293s 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:51.366 15:46:11 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:51.366 ************************************ 00:09:51.366 END TEST accel_cdev_decomp_full_mthread 00:09:51.366 ************************************ 00:09:51.626 15:46:11 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:51.626 15:46:11 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:09:51.626 15:46:11 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:51.626 15:46:11 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:09:51.626 15:46:11 accel -- accel/accel.sh@137 -- # build_accel_config 00:09:51.626 15:46:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:51.626 15:46:11 accel -- common/autotest_common.sh@10 -- # set +x 00:09:51.626 15:46:11 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:51.626 15:46:11 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:51.626 15:46:11 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:51.626 15:46:11 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:51.626 15:46:11 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:51.626 15:46:11 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:51.626 15:46:11 accel -- accel/accel.sh@41 -- # jq -r . 00:09:51.626 ************************************ 00:09:51.626 START TEST accel_dif_functional_tests 00:09:51.626 ************************************ 00:09:51.626 15:46:11 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:51.626 [2024-07-12 15:46:11.931621] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
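The xtrace above shows how the accel.sh harness drives this case: it reads colon-separated var:val settings with IFS=: and read -r var val, then dispatches on each key in a case "$var" in block, recording accel_opc=decompress and accel_module=dpdk_compressdev along with the queue depth, thread count, run time and input file. A minimal, self-contained sketch of that parsing pattern follows; the key names and the sample input are placeholders for illustration, not the harness's real option names.

    #!/usr/bin/env bash
    # Illustrative sketch of the var:val parsing visible in the trace above.
    # Only the IFS=: / read -r / case dispatch mirrors accel.sh; the key names
    # ("opc", "module") and the sample input are assumptions.
    parse_settings() {
        local var val accel_opc= accel_module=
        while IFS=: read -r var val; do
            case "$var" in
                opc)    accel_opc=$val ;;      # e.g. decompress
                module) accel_module=$val ;;   # e.g. dpdk_compressdev
                *)      : ;;                   # ignore unrecognized keys
            esac
        done
        echo "opcode=$accel_opc module=$accel_module"
    }
    printf '%s\n' opc:decompress module:dpdk_compressdev | parse_settings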
00:09:51.626 [2024-07-12 15:46:11.931679] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2475764 ] 00:09:51.626 [2024-07-12 15:46:12.020446] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:51.886 [2024-07-12 15:46:12.099218] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:51.886 [2024-07-12 15:46:12.099363] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:51.886 [2024-07-12 15:46:12.099364] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:51.886 00:09:51.886 00:09:51.886 CUnit - A unit testing framework for C - Version 2.1-3 00:09:51.886 http://cunit.sourceforge.net/ 00:09:51.886 00:09:51.886 00:09:51.886 Suite: accel_dif 00:09:51.886 Test: verify: DIF generated, GUARD check ...passed 00:09:51.886 Test: verify: DIF generated, APPTAG check ...passed 00:09:51.886 Test: verify: DIF generated, REFTAG check ...passed 00:09:51.886 Test: verify: DIF not generated, GUARD check ...[2024-07-12 15:46:12.167027] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:51.886 passed 00:09:51.886 Test: verify: DIF not generated, APPTAG check ...[2024-07-12 15:46:12.167078] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:51.886 passed 00:09:51.886 Test: verify: DIF not generated, REFTAG check ...[2024-07-12 15:46:12.167102] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:51.886 passed 00:09:51.886 Test: verify: APPTAG correct, APPTAG check ...passed 00:09:51.886 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-12 15:46:12.167153] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:09:51.886 passed 00:09:51.886 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:09:51.886 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:09:51.886 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:09:51.886 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-12 15:46:12.167270] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:09:51.886 passed 00:09:51.886 Test: verify copy: DIF generated, GUARD check ...passed 00:09:51.886 Test: verify copy: DIF generated, APPTAG check ...passed 00:09:51.886 Test: verify copy: DIF generated, REFTAG check ...passed 00:09:51.886 Test: verify copy: DIF not generated, GUARD check ...[2024-07-12 15:46:12.167393] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:51.886 passed 00:09:51.887 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-12 15:46:12.167419] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:51.887 passed 00:09:51.887 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-12 15:46:12.167442] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:51.887 passed 00:09:51.887 Test: generate copy: DIF generated, GUARD check ...passed 00:09:51.887 Test: generate copy: DIF generated, APTTAG check ...passed 00:09:51.887 Test: generate copy: DIF generated, REFTAG check ...passed 00:09:51.887 Test: generate copy: DIF generated, no GUARD check flag set ...passed 
00:09:51.887 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:09:51.887 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:09:51.887 Test: generate copy: iovecs-len validate ...[2024-07-12 15:46:12.167640] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:09:51.887 passed 00:09:51.887 Test: generate copy: buffer alignment validate ...passed 00:09:51.887 00:09:51.887 Run Summary: Type Total Ran Passed Failed Inactive 00:09:51.887 suites 1 1 n/a 0 0 00:09:51.887 tests 26 26 26 0 0 00:09:51.887 asserts 115 115 115 0 n/a 00:09:51.887 00:09:51.887 Elapsed time = 0.000 seconds 00:09:51.887 00:09:51.887 real 0m0.420s 00:09:51.887 user 0m0.531s 00:09:51.887 sys 0m0.165s 00:09:51.887 15:46:12 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:51.887 15:46:12 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:09:51.887 ************************************ 00:09:51.887 END TEST accel_dif_functional_tests 00:09:51.887 ************************************ 00:09:51.887 15:46:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:51.887 00:09:51.887 real 0m45.028s 00:09:51.887 user 0m54.460s 00:09:51.887 sys 0m7.592s 00:09:51.887 15:46:12 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:51.887 15:46:12 accel -- common/autotest_common.sh@10 -- # set +x 00:09:51.887 ************************************ 00:09:51.887 END TEST accel 00:09:51.887 ************************************ 00:09:52.147 15:46:12 -- common/autotest_common.sh@1142 -- # return 0 00:09:52.148 15:46:12 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:52.148 15:46:12 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:52.148 15:46:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.148 15:46:12 -- common/autotest_common.sh@10 -- # set +x 00:09:52.148 ************************************ 00:09:52.148 START TEST accel_rpc 00:09:52.148 ************************************ 00:09:52.148 15:46:12 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:52.148 * Looking for test storage... 00:09:52.148 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:09:52.148 15:46:12 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:52.148 15:46:12 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2475831 00:09:52.148 15:46:12 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 2475831 00:09:52.148 15:46:12 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:09:52.148 15:46:12 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 2475831 ']' 00:09:52.148 15:46:12 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:52.148 15:46:12 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:52.148 15:46:12 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:52.148 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
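The DIF functional test that just completed is started as dif -c /dev/fd/62: the harness builds an accel JSON configuration in memory (build_accel_config collects optional entries into the accel_json_cfg array, joins them with local IFS=, and normalizes the result with jq -r .) and hands it to the binary through process substitution rather than a temporary file. A hedged sketch of that invocation pattern is below; the subsystem payload shown is a placeholder, since in this run no crypto/DSA/IAA options were set and the generated config was effectively empty.

    # Sketch: feed a JSON accel config to the DIF test app via process substitution,
    # mirroring the `dif -c /dev/fd/62` invocation above. The config payload is a
    # placeholder assumption.
    accel_json_cfg=()                      # optional accel RPC snippets would go here
    gen_accel_config() {
        local IFS=,
        printf '{"subsystems":[{"subsystem":"accel","config":[%s]}]}\n' \
            "${accel_json_cfg[*]}" | jq -r .
    }
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif \
        -c <(gen_accel_config)             # the <(...) descriptor appears as /dev/fd/NN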
00:09:52.148 15:46:12 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:52.148 15:46:12 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:52.148 [2024-07-12 15:46:12.558456] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:52.148 [2024-07-12 15:46:12.558510] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2475831 ] 00:09:52.408 [2024-07-12 15:46:12.647586] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:52.408 [2024-07-12 15:46:12.713998] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:52.979 15:46:13 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:52.979 15:46:13 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:09:52.979 15:46:13 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:09:52.979 15:46:13 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:09:52.979 15:46:13 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:09:52.979 15:46:13 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:09:52.979 15:46:13 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:09:52.979 15:46:13 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:52.979 15:46:13 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:52.979 15:46:13 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:52.979 ************************************ 00:09:52.979 START TEST accel_assign_opcode 00:09:52.979 ************************************ 00:09:52.979 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:09:52.979 15:46:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:09:53.238 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.238 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:53.238 [2024-07-12 15:46:13.432038] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:09:53.238 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.238 15:46:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:09:53.238 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.238 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:53.239 [2024-07-12 15:46:13.444065] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd 
accel_get_opc_assignments 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:53.239 software 00:09:53.239 00:09:53.239 real 0m0.221s 00:09:53.239 user 0m0.050s 00:09:53.239 sys 0m0.009s 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:53.239 15:46:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:09:53.239 ************************************ 00:09:53.239 END TEST accel_assign_opcode 00:09:53.239 ************************************ 00:09:53.239 15:46:13 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:09:53.239 15:46:13 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 2475831 00:09:53.239 15:46:13 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 2475831 ']' 00:09:53.239 15:46:13 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 2475831 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2475831 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2475831' 00:09:53.499 killing process with pid 2475831 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@967 -- # kill 2475831 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@972 -- # wait 2475831 00:09:53.499 00:09:53.499 real 0m1.546s 00:09:53.499 user 0m1.654s 00:09:53.499 sys 0m0.438s 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:53.499 15:46:13 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:53.499 ************************************ 00:09:53.499 END TEST accel_rpc 00:09:53.499 ************************************ 00:09:53.761 15:46:13 -- common/autotest_common.sh@1142 -- # return 0 00:09:53.761 15:46:13 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:09:53.761 15:46:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:53.761 15:46:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:53.761 15:46:13 -- common/autotest_common.sh@10 -- # set +x 00:09:53.761 ************************************ 00:09:53.761 START TEST app_cmdline 00:09:53.761 ************************************ 00:09:53.761 15:46:14 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:09:53.761 * Looking for test storage... 
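The accel_assign_opcode test above exercises the pre-initialization RPC path: spdk_tgt is started with --wait-for-rpc, the copy opcode is assigned to a module (first an invalid one, then software), framework_start_init completes subsystem setup, and accel_get_opc_assignments is read back and filtered for "software". A condensed sketch of the same flow with scripts/rpc.py follows; it assumes a target is already running with --wait-for-rpc on the default /var/tmp/spdk.sock socket.

    # Sketch of the opcode-assignment flow driven by accel_rpc.sh above.
    # Assumes spdk_tgt was launched with --wait-for-rpc and is listening on the
    # default RPC socket.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    $SPDK/scripts/rpc.py accel_assign_opc -o copy -m software    # pre-init assignment
    $SPDK/scripts/rpc.py framework_start_init                    # complete initialization
    $SPDK/scripts/rpc.py accel_get_opc_assignments | jq -r .copy # expected output: software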
00:09:53.761 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:09:53.761 15:46:14 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:09:53.761 15:46:14 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=2476209 00:09:53.761 15:46:14 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 2476209 00:09:53.761 15:46:14 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:09:53.761 15:46:14 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 2476209 ']' 00:09:53.761 15:46:14 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:53.761 15:46:14 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:53.761 15:46:14 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:53.761 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:53.761 15:46:14 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:53.761 15:46:14 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:53.761 [2024-07-12 15:46:14.197926] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:53.761 [2024-07-12 15:46:14.197989] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2476209 ] 00:09:54.021 [2024-07-12 15:46:14.288935] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:54.021 [2024-07-12 15:46:14.356998] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:54.591 15:46:15 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:54.591 15:46:15 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:09:54.591 15:46:15 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:09:54.852 { 00:09:54.852 "version": "SPDK v24.09-pre git sha1 be7837808", 00:09:54.852 "fields": { 00:09:54.852 "major": 24, 00:09:54.852 "minor": 9, 00:09:54.852 "patch": 0, 00:09:54.852 "suffix": "-pre", 00:09:54.852 "commit": "be7837808" 00:09:54.852 } 00:09:54.852 } 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@26 -- # sort 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ 
\s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:09:54.852 15:46:15 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:54.852 15:46:15 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:55.114 request: 00:09:55.114 { 00:09:55.114 "method": "env_dpdk_get_mem_stats", 00:09:55.114 "req_id": 1 00:09:55.114 } 00:09:55.114 Got JSON-RPC error response 00:09:55.114 response: 00:09:55.114 { 00:09:55.114 "code": -32601, 00:09:55.114 "message": "Method not found" 00:09:55.114 } 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:55.114 15:46:15 app_cmdline -- app/cmdline.sh@1 -- # killprocess 2476209 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 2476209 ']' 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 2476209 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2476209 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2476209' 00:09:55.114 killing process with pid 2476209 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@967 -- # kill 2476209 00:09:55.114 15:46:15 app_cmdline -- common/autotest_common.sh@972 -- # wait 2476209 00:09:55.374 00:09:55.374 real 0m1.673s 00:09:55.374 user 0m2.065s 00:09:55.374 sys 0m0.442s 00:09:55.374 15:46:15 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:55.374 15:46:15 app_cmdline -- common/autotest_common.sh@10 -- # set +x 
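In the app_cmdline test above, spdk_tgt is started with --rpcs-allowed spdk_get_version,rpc_get_methods, so those are the only callable methods: spdk_get_version returns the version object shown (major 24, minor 9, suffix -pre, commit be7837808), rpc_get_methods lists exactly those two names, and the disallowed env_dpdk_get_mem_stats is rejected with JSON-RPC error -32601 "Method not found". A short sketch of that check is below; the error handling is simplified relative to the harness's NOT wrapper.

    # Sketch: verify the RPC allow-list behaviour exercised by cmdline.sh above.
    SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
    $SPDK/scripts/rpc.py spdk_get_version                        # allowed: prints the version object
    $SPDK/scripts/rpc.py rpc_get_methods | jq -r '.[]' | sort    # allowed: rpc_get_methods, spdk_get_version
    if ! $SPDK/scripts/rpc.py env_dpdk_get_mem_stats; then       # not in the allow-list
        echo "rejected as expected: code -32601, Method not found"
    fi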
00:09:55.374 ************************************ 00:09:55.374 END TEST app_cmdline 00:09:55.374 ************************************ 00:09:55.374 15:46:15 -- common/autotest_common.sh@1142 -- # return 0 00:09:55.374 15:46:15 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:55.374 15:46:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:55.374 15:46:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:55.374 15:46:15 -- common/autotest_common.sh@10 -- # set +x 00:09:55.374 ************************************ 00:09:55.374 START TEST version 00:09:55.374 ************************************ 00:09:55.374 15:46:15 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:55.635 * Looking for test storage... 00:09:55.635 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:09:55.635 15:46:15 version -- app/version.sh@17 -- # get_header_version major 00:09:55.635 15:46:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:55.635 15:46:15 version -- app/version.sh@14 -- # cut -f2 00:09:55.635 15:46:15 version -- app/version.sh@14 -- # tr -d '"' 00:09:55.635 15:46:15 version -- app/version.sh@17 -- # major=24 00:09:55.635 15:46:15 version -- app/version.sh@18 -- # get_header_version minor 00:09:55.635 15:46:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:55.635 15:46:15 version -- app/version.sh@14 -- # cut -f2 00:09:55.635 15:46:15 version -- app/version.sh@14 -- # tr -d '"' 00:09:55.635 15:46:15 version -- app/version.sh@18 -- # minor=9 00:09:55.635 15:46:15 version -- app/version.sh@19 -- # get_header_version patch 00:09:55.635 15:46:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:55.635 15:46:15 version -- app/version.sh@14 -- # cut -f2 00:09:55.636 15:46:15 version -- app/version.sh@14 -- # tr -d '"' 00:09:55.636 15:46:15 version -- app/version.sh@19 -- # patch=0 00:09:55.636 15:46:15 version -- app/version.sh@20 -- # get_header_version suffix 00:09:55.636 15:46:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:55.636 15:46:15 version -- app/version.sh@14 -- # cut -f2 00:09:55.636 15:46:15 version -- app/version.sh@14 -- # tr -d '"' 00:09:55.636 15:46:15 version -- app/version.sh@20 -- # suffix=-pre 00:09:55.636 15:46:15 version -- app/version.sh@22 -- # version=24.9 00:09:55.636 15:46:15 version -- app/version.sh@25 -- # (( patch != 0 )) 00:09:55.636 15:46:15 version -- app/version.sh@28 -- # version=24.9rc0 00:09:55.636 15:46:15 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:09:55.636 15:46:15 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:09:55.636 15:46:15 version -- app/version.sh@30 -- # py_version=24.9rc0 00:09:55.636 
15:46:15 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:09:55.636 00:09:55.636 real 0m0.172s 00:09:55.636 user 0m0.086s 00:09:55.636 sys 0m0.123s 00:09:55.636 15:46:15 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:55.636 15:46:15 version -- common/autotest_common.sh@10 -- # set +x 00:09:55.636 ************************************ 00:09:55.636 END TEST version 00:09:55.636 ************************************ 00:09:55.636 15:46:15 -- common/autotest_common.sh@1142 -- # return 0 00:09:55.636 15:46:15 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:09:55.636 15:46:15 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:55.636 15:46:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:55.636 15:46:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:55.636 15:46:15 -- common/autotest_common.sh@10 -- # set +x 00:09:55.636 ************************************ 00:09:55.636 START TEST blockdev_general 00:09:55.636 ************************************ 00:09:55.636 15:46:16 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:55.896 * Looking for test storage... 00:09:55.896 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:55.896 15:46:16 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@691 -- # 
wait_for_rpc=--wait-for-rpc 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2476638 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:09:55.896 15:46:16 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 2476638 00:09:55.896 15:46:16 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 2476638 ']' 00:09:55.897 15:46:16 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:09:55.897 15:46:16 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:55.897 15:46:16 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:55.897 15:46:16 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:55.897 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:55.897 15:46:16 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:55.897 15:46:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:55.897 [2024-07-12 15:46:16.195543] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:09:55.897 [2024-07-12 15:46:16.195596] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2476638 ] 00:09:55.897 [2024-07-12 15:46:16.285299] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:56.157 [2024-07-12 15:46:16.350667] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:56.727 15:46:17 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:56.727 15:46:17 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:09:56.727 15:46:17 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:09:56.727 15:46:17 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:09:56.727 15:46:17 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:09:56.727 15:46:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:56.727 15:46:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:56.988 [2024-07-12 15:46:17.198404] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:56.988 [2024-07-12 15:46:17.198444] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:56.988 00:09:56.988 [2024-07-12 15:46:17.206398] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:56.988 [2024-07-12 15:46:17.206415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:56.988 00:09:56.988 Malloc0 00:09:56.988 Malloc1 00:09:56.988 Malloc2 00:09:56.988 Malloc3 00:09:56.988 Malloc4 00:09:56.988 Malloc5 00:09:56.988 Malloc6 00:09:56.988 Malloc7 00:09:56.988 Malloc8 00:09:56.988 Malloc9 00:09:56.988 [2024-07-12 15:46:17.314866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:56.988 [2024-07-12 15:46:17.314900] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:56.988 [2024-07-12 
15:46:17.314911] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1685160 00:09:56.988 [2024-07-12 15:46:17.314918] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:56.988 [2024-07-12 15:46:17.316035] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:56.988 [2024-07-12 15:46:17.316053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:56.988 TestPT 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:56.988 15:46:17 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:09:56.988 5000+0 records in 00:09:56.988 5000+0 records out 00:09:56.988 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0168624 s, 607 MB/s 00:09:56.988 15:46:17 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:56.988 AIO0 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:56.988 15:46:17 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:56.988 15:46:17 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:09:56.988 15:46:17 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:56.988 15:46:17 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:56.988 15:46:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:57.250 15:46:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:57.250 15:46:17 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:09:57.250 15:46:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:57.250 15:46:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:57.250 15:46:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:57.250 15:46:17 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:09:57.250 15:46:17 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:09:57.250 15:46:17 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:57.250 15:46:17 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:09:57.250 15:46:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:57.250 15:46:17 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:57.250 15:46:17 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:09:57.250 15:46:17 
blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:09:57.251 15:46:17 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "df20144b-3412-42df-b0e3-a6af0faea199"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "df20144b-3412-42df-b0e3-a6af0faea199",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "841eae4f-f681-5dc2-80e4-fbc2f0926fd8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "841eae4f-f681-5dc2-80e4-fbc2f0926fd8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "9dfed297-4dfd-5dd2-8b01-7b41830cc24a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "9dfed297-4dfd-5dd2-8b01-7b41830cc24a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "909dd3f0-2477-52f2-8ae6-818f33597236"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "909dd3f0-2477-52f2-8ae6-818f33597236",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "8b94ca3b-6e2f-5242-a04b-b5ed93635d9c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8b94ca3b-6e2f-5242-a04b-b5ed93635d9c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "b3f9ac83-d266-578c-b07a-2de2df9b6868"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b3f9ac83-d266-578c-b07a-2de2df9b6868",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d73479a9-a4d1-5868-8818-6107f497c773"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d73479a9-a4d1-5868-8818-6107f497c773",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' 
"11324ce1-5b36-5e09-b448-f90225da1620"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "11324ce1-5b36-5e09-b448-f90225da1620",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "89de5192-fccc-5200-a3b2-8989327a8d2f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "89de5192-fccc-5200-a3b2-8989327a8d2f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "7f2fcd30-1715-5317-ac5b-3cba6f0ad3f0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7f2fcd30-1715-5317-ac5b-3cba6f0ad3f0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "13ca2c17-e4bb-518d-bfd9-0d303997b38f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "13ca2c17-e4bb-518d-bfd9-0d303997b38f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": 
false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "0ebec7f2-d311-523c-a6da-89849c326a8f"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0ebec7f2-d311-523c-a6da-89849c326a8f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "2a61e231-61ec-4c2f-aafe-c0fd5b1f102d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2a61e231-61ec-4c2f-aafe-c0fd5b1f102d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2a61e231-61ec-4c2f-aafe-c0fd5b1f102d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "e84df681-2a24-4f3e-a4fa-c4dd1c22c930",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "e13c88a1-1488-4044-aace-f91811607984",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "fc3cae4a-85a6-45d0-8cef-4e59d486842c"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "fc3cae4a-85a6-45d0-8cef-4e59d486842c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "fc3cae4a-85a6-45d0-8cef-4e59d486842c",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "27e4fad7-48be-4bd4-9bc2-d5e03ee46a61",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "22a4b234-e34c-4e10-bb04-a0a979a5a1d6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "e16fa3a4-852b-471a-b21a-6117bfe5373a"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e16fa3a4-852b-471a-b21a-6117bfe5373a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "e16fa3a4-852b-471a-b21a-6117bfe5373a",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "ad0cc246-e7c3-40b7-a94b-ecd40f190cb6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "70e7f65d-f9cc-4b17-a156-fbecc61f66dd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "cc549ac3-5341-485d-944a-6a87e887dda7"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"cc549ac3-5341-485d-944a-6a87e887dda7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:57.251 15:46:17 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:09:57.251 15:46:17 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:09:57.251 15:46:17 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:09:57.251 15:46:17 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 2476638 00:09:57.251 15:46:17 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 2476638 ']' 00:09:57.251 15:46:17 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 2476638 00:09:57.251 15:46:17 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:09:57.251 15:46:17 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:57.251 15:46:17 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2476638 00:09:57.512 15:46:17 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:57.512 15:46:17 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:57.512 15:46:17 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2476638' 00:09:57.512 killing process with pid 2476638 00:09:57.512 15:46:17 blockdev_general -- common/autotest_common.sh@967 -- # kill 2476638 00:09:57.512 15:46:17 blockdev_general -- common/autotest_common.sh@972 -- # wait 2476638 00:09:57.773 15:46:17 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:57.773 15:46:17 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:57.773 15:46:17 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:57.773 15:46:17 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:57.773 15:46:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:57.773 ************************************ 00:09:57.773 START TEST bdev_hello_world 00:09:57.773 ************************************ 00:09:57.773 15:46:17 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:57.773 [2024-07-12 15:46:18.058156] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:09:57.773 [2024-07-12 15:46:18.058202] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2476969 ] 00:09:57.773 [2024-07-12 15:46:18.144567] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:57.773 [2024-07-12 15:46:18.208069] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:58.033 [2024-07-12 15:46:18.326165] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:58.033 [2024-07-12 15:46:18.326209] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:58.033 [2024-07-12 15:46:18.326218] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:58.033 [2024-07-12 15:46:18.334170] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:58.033 [2024-07-12 15:46:18.334188] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:58.033 [2024-07-12 15:46:18.342195] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:58.033 [2024-07-12 15:46:18.342211] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:58.033 [2024-07-12 15:46:18.403117] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:58.033 [2024-07-12 15:46:18.403155] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:58.033 [2024-07-12 15:46:18.403165] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x187e730 00:09:58.033 [2024-07-12 15:46:18.403171] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:58.033 [2024-07-12 15:46:18.404349] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:58.033 [2024-07-12 15:46:18.404370] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:58.292 [2024-07-12 15:46:18.538058] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:09:58.292 [2024-07-12 15:46:18.538100] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:09:58.292 [2024-07-12 15:46:18.538130] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:09:58.292 [2024-07-12 15:46:18.538175] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:09:58.292 [2024-07-12 15:46:18.538223] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:09:58.292 [2024-07-12 15:46:18.538238] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:09:58.292 [2024-07-12 15:46:18.538274] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
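The hello_world exercise above wraps a single SPDK example binary; stripped of the test harness it reduces to the command already visible in the trace (workspace-relative paths shown for brevity; Malloc0 is defined in test/bdev/bdev.json):

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./build/examples/hello_bdev --json ./test/bdev/bdev.json -b Malloc0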
00:09:58.292 00:09:58.292 [2024-07-12 15:46:18.538293] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:09:58.292 00:09:58.292 real 0m0.717s 00:09:58.292 user 0m0.486s 00:09:58.292 sys 0m0.188s 00:09:58.292 15:46:18 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:58.292 15:46:18 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:09:58.292 ************************************ 00:09:58.292 END TEST bdev_hello_world 00:09:58.292 ************************************ 00:09:58.552 15:46:18 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:58.552 15:46:18 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:09:58.552 15:46:18 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:58.552 15:46:18 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:58.552 15:46:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:58.552 ************************************ 00:09:58.552 START TEST bdev_bounds 00:09:58.552 ************************************ 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2477183 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2477183' 00:09:58.552 Process bdevio pid: 2477183 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2477183 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2477183 ']' 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:58.552 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:58.552 15:46:18 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:58.552 [2024-07-12 15:46:18.859051] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:09:58.552 [2024-07-12 15:46:18.859105] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2477183 ] 00:09:58.552 [2024-07-12 15:46:18.948384] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:58.811 [2024-07-12 15:46:19.018214] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:58.811 [2024-07-12 15:46:19.018357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:58.811 [2024-07-12 15:46:19.018358] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:58.811 [2024-07-12 15:46:19.136572] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:58.811 [2024-07-12 15:46:19.136608] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:58.811 [2024-07-12 15:46:19.136616] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:58.811 [2024-07-12 15:46:19.144582] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:58.811 [2024-07-12 15:46:19.144601] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:58.811 [2024-07-12 15:46:19.152596] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:58.811 [2024-07-12 15:46:19.152612] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:58.811 [2024-07-12 15:46:19.213429] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:58.811 [2024-07-12 15:46:19.213465] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:58.811 [2024-07-12 15:46:19.213476] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2a75ff0 00:09:58.811 [2024-07-12 15:46:19.213482] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:58.811 [2024-07-12 15:46:19.214784] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:58.811 [2024-07-12 15:46:19.214804] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:59.378 15:46:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:59.378 15:46:19 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:09:59.378 15:46:19 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:09:59.378 I/O targets: 00:09:59.378 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:09:59.378 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:09:59.378 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:09:59.378 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:09:59.378 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:09:59.378 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:09:59.378 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:09:59.378 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:09:59.378 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:09:59.378 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:09:59.378 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:09:59.378 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:09:59.378 raid0: 131072 blocks of 512 bytes (64 MiB) 00:09:59.378 concat0: 131072 blocks of 512 bytes (64 MiB) 
00:09:59.378 raid1: 65536 blocks of 512 bytes (32 MiB) 00:09:59.378 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:09:59.378 00:09:59.378 00:09:59.378 CUnit - A unit testing framework for C - Version 2.1-3 00:09:59.378 http://cunit.sourceforge.net/ 00:09:59.378 00:09:59.378 00:09:59.378 Suite: bdevio tests on: AIO0 00:09:59.378 Test: blockdev write read block ...passed 00:09:59.378 Test: blockdev write zeroes read block ...passed 00:09:59.378 Test: blockdev write zeroes read no split ...passed 00:09:59.378 Test: blockdev write zeroes read split ...passed 00:09:59.378 Test: blockdev write zeroes read split partial ...passed 00:09:59.378 Test: blockdev reset ...passed 00:09:59.378 Test: blockdev write read 8 blocks ...passed 00:09:59.378 Test: blockdev write read size > 128k ...passed 00:09:59.378 Test: blockdev write read invalid size ...passed 00:09:59.378 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.378 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.378 Test: blockdev write read max offset ...passed 00:09:59.378 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.378 Test: blockdev writev readv 8 blocks ...passed 00:09:59.378 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.378 Test: blockdev writev readv block ...passed 00:09:59.378 Test: blockdev writev readv size > 128k ...passed 00:09:59.378 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.378 Test: blockdev comparev and writev ...passed 00:09:59.378 Test: blockdev nvme passthru rw ...passed 00:09:59.378 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.378 Test: blockdev nvme admin passthru ...passed 00:09:59.378 Test: blockdev copy ...passed 00:09:59.378 Suite: bdevio tests on: raid1 00:09:59.378 Test: blockdev write read block ...passed 00:09:59.378 Test: blockdev write zeroes read block ...passed 00:09:59.378 Test: blockdev write zeroes read no split ...passed 00:09:59.378 Test: blockdev write zeroes read split ...passed 00:09:59.378 Test: blockdev write zeroes read split partial ...passed 00:09:59.378 Test: blockdev reset ...passed 00:09:59.378 Test: blockdev write read 8 blocks ...passed 00:09:59.378 Test: blockdev write read size > 128k ...passed 00:09:59.378 Test: blockdev write read invalid size ...passed 00:09:59.378 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.378 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.378 Test: blockdev write read max offset ...passed 00:09:59.378 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.378 Test: blockdev writev readv 8 blocks ...passed 00:09:59.378 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.378 Test: blockdev writev readv block ...passed 00:09:59.378 Test: blockdev writev readv size > 128k ...passed 00:09:59.378 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.378 Test: blockdev comparev and writev ...passed 00:09:59.378 Test: blockdev nvme passthru rw ...passed 00:09:59.378 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.378 Test: blockdev nvme admin passthru ...passed 00:09:59.378 Test: blockdev copy ...passed 00:09:59.378 Suite: bdevio tests on: concat0 00:09:59.378 Test: blockdev write read block ...passed 00:09:59.378 Test: blockdev write zeroes read block ...passed 00:09:59.378 Test: blockdev write zeroes read no split ...passed 00:09:59.378 Test: blockdev write zeroes read split 
...passed 00:09:59.674 Test: blockdev write zeroes read split partial ...passed 00:09:59.674 Test: blockdev reset ...passed 00:09:59.674 Test: blockdev write read 8 blocks ...passed 00:09:59.674 Test: blockdev write read size > 128k ...passed 00:09:59.674 Test: blockdev write read invalid size ...passed 00:09:59.674 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.674 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.674 Test: blockdev write read max offset ...passed 00:09:59.674 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.674 Test: blockdev writev readv 8 blocks ...passed 00:09:59.674 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.674 Test: blockdev writev readv block ...passed 00:09:59.674 Test: blockdev writev readv size > 128k ...passed 00:09:59.674 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.674 Test: blockdev comparev and writev ...passed 00:09:59.674 Test: blockdev nvme passthru rw ...passed 00:09:59.674 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.674 Test: blockdev nvme admin passthru ...passed 00:09:59.674 Test: blockdev copy ...passed 00:09:59.674 Suite: bdevio tests on: raid0 00:09:59.674 Test: blockdev write read block ...passed 00:09:59.674 Test: blockdev write zeroes read block ...passed 00:09:59.674 Test: blockdev write zeroes read no split ...passed 00:09:59.674 Test: blockdev write zeroes read split ...passed 00:09:59.675 Test: blockdev write zeroes read split partial ...passed 00:09:59.675 Test: blockdev reset ...passed 00:09:59.675 Test: blockdev write read 8 blocks ...passed 00:09:59.675 Test: blockdev write read size > 128k ...passed 00:09:59.675 Test: blockdev write read invalid size ...passed 00:09:59.675 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.675 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.675 Test: blockdev write read max offset ...passed 00:09:59.675 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.675 Test: blockdev writev readv 8 blocks ...passed 00:09:59.675 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.675 Test: blockdev writev readv block ...passed 00:09:59.675 Test: blockdev writev readv size > 128k ...passed 00:09:59.675 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.675 Test: blockdev comparev and writev ...passed 00:09:59.675 Test: blockdev nvme passthru rw ...passed 00:09:59.675 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.675 Test: blockdev nvme admin passthru ...passed 00:09:59.675 Test: blockdev copy ...passed 00:09:59.675 Suite: bdevio tests on: TestPT 00:09:59.675 Test: blockdev write read block ...passed 00:09:59.675 Test: blockdev write zeroes read block ...passed 00:09:59.675 Test: blockdev write zeroes read no split ...passed 00:09:59.675 Test: blockdev write zeroes read split ...passed 00:09:59.675 Test: blockdev write zeroes read split partial ...passed 00:09:59.675 Test: blockdev reset ...passed 00:09:59.675 Test: blockdev write read 8 blocks ...passed 00:09:59.675 Test: blockdev write read size > 128k ...passed 00:09:59.675 Test: blockdev write read invalid size ...passed 00:09:59.675 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.675 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.675 Test: blockdev write read max offset ...passed 00:09:59.675 Test: 
blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.675 Test: blockdev writev readv 8 blocks ...passed 00:09:59.675 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.675 Test: blockdev writev readv block ...passed 00:09:59.675 Test: blockdev writev readv size > 128k ...passed 00:09:59.675 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.675 Test: blockdev comparev and writev ...passed 00:09:59.675 Test: blockdev nvme passthru rw ...passed 00:09:59.675 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.675 Test: blockdev nvme admin passthru ...passed 00:09:59.675 Test: blockdev copy ...passed 00:09:59.675 Suite: bdevio tests on: Malloc2p7 00:09:59.675 Test: blockdev write read block ...passed 00:09:59.675 Test: blockdev write zeroes read block ...passed 00:09:59.675 Test: blockdev write zeroes read no split ...passed 00:09:59.675 Test: blockdev write zeroes read split ...passed 00:09:59.675 Test: blockdev write zeroes read split partial ...passed 00:09:59.675 Test: blockdev reset ...passed 00:09:59.675 Test: blockdev write read 8 blocks ...passed 00:09:59.675 Test: blockdev write read size > 128k ...passed 00:09:59.675 Test: blockdev write read invalid size ...passed 00:09:59.675 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.675 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.675 Test: blockdev write read max offset ...passed 00:09:59.675 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.675 Test: blockdev writev readv 8 blocks ...passed 00:09:59.675 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.675 Test: blockdev writev readv block ...passed 00:09:59.675 Test: blockdev writev readv size > 128k ...passed 00:09:59.675 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.675 Test: blockdev comparev and writev ...passed 00:09:59.675 Test: blockdev nvme passthru rw ...passed 00:09:59.675 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.675 Test: blockdev nvme admin passthru ...passed 00:09:59.675 Test: blockdev copy ...passed 00:09:59.675 Suite: bdevio tests on: Malloc2p6 00:09:59.675 Test: blockdev write read block ...passed 00:09:59.675 Test: blockdev write zeroes read block ...passed 00:09:59.675 Test: blockdev write zeroes read no split ...passed 00:09:59.675 Test: blockdev write zeroes read split ...passed 00:09:59.675 Test: blockdev write zeroes read split partial ...passed 00:09:59.675 Test: blockdev reset ...passed 00:09:59.675 Test: blockdev write read 8 blocks ...passed 00:09:59.675 Test: blockdev write read size > 128k ...passed 00:09:59.675 Test: blockdev write read invalid size ...passed 00:09:59.675 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.675 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.675 Test: blockdev write read max offset ...passed 00:09:59.675 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.675 Test: blockdev writev readv 8 blocks ...passed 00:09:59.675 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.675 Test: blockdev writev readv block ...passed 00:09:59.675 Test: blockdev writev readv size > 128k ...passed 00:09:59.675 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.675 Test: blockdev comparev and writev ...passed 00:09:59.675 Test: blockdev nvme passthru rw ...passed 00:09:59.675 Test: blockdev nvme passthru vendor 
specific ...passed 00:09:59.675 Test: blockdev nvme admin passthru ...passed 00:09:59.675 Test: blockdev copy ...passed 00:09:59.675 Suite: bdevio tests on: Malloc2p5 00:09:59.675 Test: blockdev write read block ...passed 00:09:59.675 Test: blockdev write zeroes read block ...passed 00:09:59.675 Test: blockdev write zeroes read no split ...passed 00:09:59.675 Test: blockdev write zeroes read split ...passed 00:09:59.675 Test: blockdev write zeroes read split partial ...passed 00:09:59.675 Test: blockdev reset ...passed 00:09:59.675 Test: blockdev write read 8 blocks ...passed 00:09:59.675 Test: blockdev write read size > 128k ...passed 00:09:59.675 Test: blockdev write read invalid size ...passed 00:09:59.675 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.675 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.675 Test: blockdev write read max offset ...passed 00:09:59.675 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.675 Test: blockdev writev readv 8 blocks ...passed 00:09:59.675 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.675 Test: blockdev writev readv block ...passed 00:09:59.675 Test: blockdev writev readv size > 128k ...passed 00:09:59.675 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.675 Test: blockdev comparev and writev ...passed 00:09:59.675 Test: blockdev nvme passthru rw ...passed 00:09:59.675 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.675 Test: blockdev nvme admin passthru ...passed 00:09:59.675 Test: blockdev copy ...passed 00:09:59.675 Suite: bdevio tests on: Malloc2p4 00:09:59.675 Test: blockdev write read block ...passed 00:09:59.675 Test: blockdev write zeroes read block ...passed 00:09:59.675 Test: blockdev write zeroes read no split ...passed 00:09:59.675 Test: blockdev write zeroes read split ...passed 00:09:59.675 Test: blockdev write zeroes read split partial ...passed 00:09:59.675 Test: blockdev reset ...passed 00:09:59.675 Test: blockdev write read 8 blocks ...passed 00:09:59.675 Test: blockdev write read size > 128k ...passed 00:09:59.675 Test: blockdev write read invalid size ...passed 00:09:59.675 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.675 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.675 Test: blockdev write read max offset ...passed 00:09:59.675 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.675 Test: blockdev writev readv 8 blocks ...passed 00:09:59.675 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.675 Test: blockdev writev readv block ...passed 00:09:59.675 Test: blockdev writev readv size > 128k ...passed 00:09:59.675 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.675 Test: blockdev comparev and writev ...passed 00:09:59.675 Test: blockdev nvme passthru rw ...passed 00:09:59.675 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.675 Test: blockdev nvme admin passthru ...passed 00:09:59.675 Test: blockdev copy ...passed 00:09:59.675 Suite: bdevio tests on: Malloc2p3 00:09:59.675 Test: blockdev write read block ...passed 00:09:59.675 Test: blockdev write zeroes read block ...passed 00:09:59.675 Test: blockdev write zeroes read no split ...passed 00:09:59.675 Test: blockdev write zeroes read split ...passed 00:09:59.675 Test: blockdev write zeroes read split partial ...passed 00:09:59.675 Test: blockdev reset ...passed 00:09:59.675 Test: 
blockdev write read 8 blocks ...passed 00:09:59.675 Test: blockdev write read size > 128k ...passed 00:09:59.675 Test: blockdev write read invalid size ...passed 00:09:59.675 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.675 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.675 Test: blockdev write read max offset ...passed 00:09:59.675 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.675 Test: blockdev writev readv 8 blocks ...passed 00:09:59.675 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.675 Test: blockdev writev readv block ...passed 00:09:59.675 Test: blockdev writev readv size > 128k ...passed 00:09:59.675 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.675 Test: blockdev comparev and writev ...passed 00:09:59.675 Test: blockdev nvme passthru rw ...passed 00:09:59.675 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.675 Test: blockdev nvme admin passthru ...passed 00:09:59.675 Test: blockdev copy ...passed 00:09:59.675 Suite: bdevio tests on: Malloc2p2 00:09:59.675 Test: blockdev write read block ...passed 00:09:59.675 Test: blockdev write zeroes read block ...passed 00:09:59.675 Test: blockdev write zeroes read no split ...passed 00:09:59.676 Test: blockdev write zeroes read split ...passed 00:09:59.676 Test: blockdev write zeroes read split partial ...passed 00:09:59.676 Test: blockdev reset ...passed 00:09:59.676 Test: blockdev write read 8 blocks ...passed 00:09:59.676 Test: blockdev write read size > 128k ...passed 00:09:59.676 Test: blockdev write read invalid size ...passed 00:09:59.676 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.676 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.676 Test: blockdev write read max offset ...passed 00:09:59.676 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.676 Test: blockdev writev readv 8 blocks ...passed 00:09:59.676 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.676 Test: blockdev writev readv block ...passed 00:09:59.676 Test: blockdev writev readv size > 128k ...passed 00:09:59.676 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.676 Test: blockdev comparev and writev ...passed 00:09:59.676 Test: blockdev nvme passthru rw ...passed 00:09:59.676 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.676 Test: blockdev nvme admin passthru ...passed 00:09:59.676 Test: blockdev copy ...passed 00:09:59.676 Suite: bdevio tests on: Malloc2p1 00:09:59.676 Test: blockdev write read block ...passed 00:09:59.676 Test: blockdev write zeroes read block ...passed 00:09:59.676 Test: blockdev write zeroes read no split ...passed 00:09:59.676 Test: blockdev write zeroes read split ...passed 00:09:59.676 Test: blockdev write zeroes read split partial ...passed 00:09:59.676 Test: blockdev reset ...passed 00:09:59.676 Test: blockdev write read 8 blocks ...passed 00:09:59.676 Test: blockdev write read size > 128k ...passed 00:09:59.676 Test: blockdev write read invalid size ...passed 00:09:59.676 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.676 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.676 Test: blockdev write read max offset ...passed 00:09:59.676 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.676 Test: blockdev writev readv 8 blocks ...passed 00:09:59.676 
Test: blockdev writev readv 30 x 1block ...passed 00:09:59.676 Test: blockdev writev readv block ...passed 00:09:59.676 Test: blockdev writev readv size > 128k ...passed 00:09:59.676 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.676 Test: blockdev comparev and writev ...passed 00:09:59.676 Test: blockdev nvme passthru rw ...passed 00:09:59.676 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.676 Test: blockdev nvme admin passthru ...passed 00:09:59.676 Test: blockdev copy ...passed 00:09:59.676 Suite: bdevio tests on: Malloc2p0 00:09:59.676 Test: blockdev write read block ...passed 00:09:59.676 Test: blockdev write zeroes read block ...passed 00:09:59.676 Test: blockdev write zeroes read no split ...passed 00:09:59.676 Test: blockdev write zeroes read split ...passed 00:09:59.676 Test: blockdev write zeroes read split partial ...passed 00:09:59.676 Test: blockdev reset ...passed 00:09:59.676 Test: blockdev write read 8 blocks ...passed 00:09:59.676 Test: blockdev write read size > 128k ...passed 00:09:59.676 Test: blockdev write read invalid size ...passed 00:09:59.676 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.676 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.676 Test: blockdev write read max offset ...passed 00:09:59.676 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.676 Test: blockdev writev readv 8 blocks ...passed 00:09:59.676 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.676 Test: blockdev writev readv block ...passed 00:09:59.676 Test: blockdev writev readv size > 128k ...passed 00:09:59.676 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.676 Test: blockdev comparev and writev ...passed 00:09:59.676 Test: blockdev nvme passthru rw ...passed 00:09:59.676 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.676 Test: blockdev nvme admin passthru ...passed 00:09:59.676 Test: blockdev copy ...passed 00:09:59.676 Suite: bdevio tests on: Malloc1p1 00:09:59.676 Test: blockdev write read block ...passed 00:09:59.676 Test: blockdev write zeroes read block ...passed 00:09:59.676 Test: blockdev write zeroes read no split ...passed 00:09:59.676 Test: blockdev write zeroes read split ...passed 00:09:59.676 Test: blockdev write zeroes read split partial ...passed 00:09:59.676 Test: blockdev reset ...passed 00:09:59.676 Test: blockdev write read 8 blocks ...passed 00:09:59.676 Test: blockdev write read size > 128k ...passed 00:09:59.676 Test: blockdev write read invalid size ...passed 00:09:59.676 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.676 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.676 Test: blockdev write read max offset ...passed 00:09:59.676 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.676 Test: blockdev writev readv 8 blocks ...passed 00:09:59.676 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.676 Test: blockdev writev readv block ...passed 00:09:59.676 Test: blockdev writev readv size > 128k ...passed 00:09:59.676 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.676 Test: blockdev comparev and writev ...passed 00:09:59.676 Test: blockdev nvme passthru rw ...passed 00:09:59.676 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.676 Test: blockdev nvme admin passthru ...passed 00:09:59.676 Test: blockdev copy ...passed 00:09:59.676 Suite: 
bdevio tests on: Malloc1p0 00:09:59.676 Test: blockdev write read block ...passed 00:09:59.676 Test: blockdev write zeroes read block ...passed 00:09:59.676 Test: blockdev write zeroes read no split ...passed 00:09:59.676 Test: blockdev write zeroes read split ...passed 00:09:59.676 Test: blockdev write zeroes read split partial ...passed 00:09:59.676 Test: blockdev reset ...passed 00:09:59.676 Test: blockdev write read 8 blocks ...passed 00:09:59.676 Test: blockdev write read size > 128k ...passed 00:09:59.676 Test: blockdev write read invalid size ...passed 00:09:59.676 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.676 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.676 Test: blockdev write read max offset ...passed 00:09:59.676 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.676 Test: blockdev writev readv 8 blocks ...passed 00:09:59.676 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.676 Test: blockdev writev readv block ...passed 00:09:59.676 Test: blockdev writev readv size > 128k ...passed 00:09:59.676 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.676 Test: blockdev comparev and writev ...passed 00:09:59.676 Test: blockdev nvme passthru rw ...passed 00:09:59.676 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.676 Test: blockdev nvme admin passthru ...passed 00:09:59.676 Test: blockdev copy ...passed 00:09:59.676 Suite: bdevio tests on: Malloc0 00:09:59.676 Test: blockdev write read block ...passed 00:09:59.676 Test: blockdev write zeroes read block ...passed 00:09:59.676 Test: blockdev write zeroes read no split ...passed 00:09:59.676 Test: blockdev write zeroes read split ...passed 00:09:59.676 Test: blockdev write zeroes read split partial ...passed 00:09:59.676 Test: blockdev reset ...passed 00:09:59.676 Test: blockdev write read 8 blocks ...passed 00:09:59.676 Test: blockdev write read size > 128k ...passed 00:09:59.676 Test: blockdev write read invalid size ...passed 00:09:59.676 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:59.676 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:59.676 Test: blockdev write read max offset ...passed 00:09:59.676 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:59.676 Test: blockdev writev readv 8 blocks ...passed 00:09:59.676 Test: blockdev writev readv 30 x 1block ...passed 00:09:59.676 Test: blockdev writev readv block ...passed 00:09:59.676 Test: blockdev writev readv size > 128k ...passed 00:09:59.676 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:59.676 Test: blockdev comparev and writev ...passed 00:09:59.676 Test: blockdev nvme passthru rw ...passed 00:09:59.676 Test: blockdev nvme passthru vendor specific ...passed 00:09:59.676 Test: blockdev nvme admin passthru ...passed 00:09:59.676 Test: blockdev copy ...passed 00:09:59.676 00:09:59.676 Run Summary: Type Total Ran Passed Failed Inactive 00:09:59.676 suites 16 16 n/a 0 0 00:09:59.676 tests 368 368 368 0 0 00:09:59.676 asserts 2224 2224 2224 0 n/a 00:09:59.676 00:09:59.676 Elapsed time = 0.633 seconds 00:09:59.676 0 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2477183 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2477183 ']' 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2477183 
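The 16 suites / 368 tests summarized above are produced by the bdevio app being driven over RPC: the harness starts bdevio with -w (wait for tests to be triggered) and then calls perform_tests from tests.py. A minimal manual reproduction, using the same arguments as the trace, would be roughly:

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./test/bdev/bdevio/bdevio -w -s 0 --json ./test/bdev/bdev.json &
  ./test/bdev/bdevio/tests.py perform_tests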
00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2477183 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2477183' 00:09:59.969 killing process with pid 2477183 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2477183 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2477183 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:09:59.969 00:09:59.969 real 0m1.530s 00:09:59.969 user 0m3.923s 00:09:59.969 sys 0m0.342s 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:59.969 15:46:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:59.969 ************************************ 00:09:59.969 END TEST bdev_bounds 00:09:59.969 ************************************ 00:09:59.969 15:46:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:09:59.969 15:46:20 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:59.969 15:46:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:09:59.969 15:46:20 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:59.969 15:46:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:59.969 ************************************ 00:09:59.969 START TEST bdev_nbd 00:09:59.969 ************************************ 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2477408 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2477408 /var/tmp/spdk-nbd.sock 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2477408 ']' 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:59.969 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:59.969 15:46:20 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:10:00.227 [2024-07-12 15:46:20.456438] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
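For the NBD test the harness first brings up a bare bdev_svc application on its own RPC socket and only then drives it with rpc.py. The equivalent manual setup, reproduced from the command in the trace (the readiness probe via rpc_get_methods is just one way to approximate what waitforlisten does), is:

  cd /var/jenkins/workspace/crypto-phy-autotest/spdk
  ./test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json ./test/bdev/bdev.json &
  until ./scripts/rpc.py -s /var/tmp/spdk-nbd.sock rpc_get_methods > /dev/null 2>&1; do sleep 0.5; done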
00:10:00.227 [2024-07-12 15:46:20.456491] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:00.227 [2024-07-12 15:46:20.548212] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:00.228 [2024-07-12 15:46:20.625139] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:00.485 [2024-07-12 15:46:20.744090] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:00.485 [2024-07-12 15:46:20.744131] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:00.485 [2024-07-12 15:46:20.744140] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:00.486 [2024-07-12 15:46:20.752097] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:00.486 [2024-07-12 15:46:20.752116] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:00.486 [2024-07-12 15:46:20.760111] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:00.486 [2024-07-12 15:46:20.760127] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:00.486 [2024-07-12 15:46:20.820900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:00.486 [2024-07-12 15:46:20.820938] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:00.486 [2024-07-12 15:46:20.820949] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21df930 00:10:00.486 [2024-07-12 15:46:20.820955] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:00.486 [2024-07-12 15:46:20.822153] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:00.486 [2024-07-12 15:46:20.822174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 
'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:10:01.053 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:01.312 1+0 records in 00:10:01.312 1+0 records out 00:10:01.312 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000296712 s, 13.8 MB/s 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:10:01.312 15:46:21 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:01.312 1+0 records in 00:10:01.312 1+0 records out 00:10:01.312 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271613 s, 15.1 MB/s 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:01.312 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:01.571 1+0 records in 00:10:01.571 1+0 records out 00:10:01.571 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342298 s, 12.0 MB/s 00:10:01.571 
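Each nbd_start_disk / waitfornbd / dd block above, and the ones that follow for the remaining bdevs, performs the same per-device attach-and-probe sequence. Collapsed into one sketch (scratch file path assumed; as the trace shows, rpc.py prints the /dev/nbdN path that was allocated):

  nbd_dev=$(./scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1)
  grep -q -w "$(basename "$nbd_dev")" /proc/partitions          # device registered?
  dd if="$nbd_dev" of=/tmp/nbdtest bs=4096 count=1 iflag=direct  # 4 KiB direct-I/O sanity read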
15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:01.571 15:46:21 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:01.830 1+0 records in 00:10:01.830 1+0 records out 00:10:01.830 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292526 s, 14.0 MB/s 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:01.830 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:02.089 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:02.090 1+0 records in 00:10:02.090 1+0 records out 00:10:02.090 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308684 s, 13.3 MB/s 00:10:02.090 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:02.090 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:02.090 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:02.090 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:02.090 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:02.090 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:02.090 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:02.090 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:10:02.349 1+0 records in 00:10:02.349 1+0 records out 00:10:02.349 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315032 s, 13.0 MB/s 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:02.349 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:02.609 1+0 records in 00:10:02.609 1+0 records out 00:10:02.609 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000339921 s, 12.0 MB/s 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:02.609 15:46:22 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 
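Annotation: the cycle that has repeated above for each bdev (an nbd_start_disk RPC, a poll of /proc/partitions, then a single-block dd read) is the waitfornbd helper traced at common/autotest_common.sh@866-887. Below is a minimal stand-alone sketch reconstructed only from that trace; the retry delay and the scratch-file path are assumptions, everything else mirrors the traced commands.

    #!/usr/bin/env bash
    # Reconstructed from the waitfornbd trace above; sleep interval and scratch
    # file location are assumptions.
    waitfornbd() {
        local nbd_name=$1
        local i
        # Wait (up to 20 attempts) for the kernel to list the device in /proc/partitions.
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1          # assumed back-off; not visible in the trace
        done
        # Then read one 4 KiB block with O_DIRECT and require a non-empty copy,
        # again retrying up to 20 times.
        for ((i = 1; i <= 20; i++)); do
            dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct 2>/dev/null
            local size
            size=$(stat -c %s /tmp/nbdtest 2>/dev/null || echo 0)
            rm -f /tmp/nbdtest
            [ "$size" != 0 ] && return 0
            sleep 0.1          # assumed back-off
        done
        return 1
    }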
00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:02.868 1+0 records in 00:10:02.868 1+0 records out 00:10:02.868 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287487 s, 14.2 MB/s 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:02.868 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:02.868 15:46:23 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:03.127 1+0 records in 00:10:03.127 1+0 records out 00:10:03.127 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312742 s, 13.1 MB/s 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:03.127 1+0 records in 00:10:03.127 1+0 records out 00:10:03.127 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000342711 s, 12.0 MB/s 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:03.127 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:03.127 15:46:23 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:03.388 1+0 records in 00:10:03.388 1+0 records out 00:10:03.388 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000322104 s, 12.7 MB/s 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:03.388 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:03.648 15:46:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:03.648 1+0 records in 00:10:03.648 1+0 records out 00:10:03.648 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000357403 s, 11.5 MB/s 00:10:03.648 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.648 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:03.648 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.648 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:03.648 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:03.648 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:03.648 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:03.648 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:10:03.907 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:10:03.907 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:10:03.907 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:10:03.907 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:10:03.907 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:03.907 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:03.907 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:03.908 1+0 records in 00:10:03.908 1+0 records out 00:10:03.908 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000441324 s, 9.3 MB/s 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:03.908 15:46:24 
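Annotation: both invocation forms of the nbd_start_disk RPC appear in this log: without a device argument it picks the next free /dev/nbdN and prints the chosen path, while the later data-verify phase passes an explicit device. A hedged usage sketch, reusing the socket path, RPC names and bdev names from the trace (the $RPC variable is only a readability shorthand):

    # Usage sketch for the RPCs exercised in this log.
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
    SOCK="/var/tmp/spdk-nbd.sock"

    # Form 1: let SPDK choose the nbd device; the chosen path is printed on stdout.
    dev=$("$RPC" -s "$SOCK" nbd_start_disk Malloc1p1)      # e.g. /dev/nbd2 in the log
    echo "Malloc1p1 exported as $dev"

    # Form 2: request a specific device explicitly (as the data-verify phase does).
    "$RPC" -s "$SOCK" nbd_start_disk Malloc0 /dev/nbd0

    # Tear-down is symmetric.
    "$RPC" -s "$SOCK" nbd_stop_disk "$dev"
    "$RPC" -s "$SOCK" nbd_stop_disk /dev/nbd0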
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:03.908 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:04.168 1+0 records in 00:10:04.168 1+0 records out 00:10:04.168 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000451001 s, 9.1 MB/s 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:04.168 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w 
nbd14 /proc/partitions 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:04.427 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:04.428 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:04.428 1+0 records in 00:10:04.428 1+0 records out 00:10:04.428 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000456203 s, 9.0 MB/s 00:10:04.428 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:04.428 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:04.428 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:04.428 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:04.428 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:04.428 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:04.428 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:04.428 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:04.688 1+0 records in 00:10:04.688 1+0 records out 00:10:04.688 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00049728 s, 8.2 MB/s 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:04.688 15:46:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:04.688 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd0", 00:10:04.688 "bdev_name": "Malloc0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd1", 00:10:04.688 "bdev_name": "Malloc1p0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd2", 00:10:04.688 "bdev_name": "Malloc1p1" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd3", 00:10:04.688 "bdev_name": "Malloc2p0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd4", 00:10:04.688 "bdev_name": "Malloc2p1" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd5", 00:10:04.688 "bdev_name": "Malloc2p2" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd6", 00:10:04.688 "bdev_name": "Malloc2p3" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd7", 00:10:04.688 "bdev_name": "Malloc2p4" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd8", 00:10:04.688 "bdev_name": "Malloc2p5" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd9", 00:10:04.688 "bdev_name": "Malloc2p6" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd10", 00:10:04.688 "bdev_name": "Malloc2p7" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd11", 00:10:04.688 "bdev_name": "TestPT" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd12", 00:10:04.688 "bdev_name": "raid0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd13", 00:10:04.688 "bdev_name": "concat0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd14", 00:10:04.688 "bdev_name": "raid1" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd15", 00:10:04.688 "bdev_name": "AIO0" 00:10:04.688 } 00:10:04.688 ]' 00:10:04.688 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:10:04.688 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd0", 00:10:04.688 "bdev_name": "Malloc0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd1", 00:10:04.688 "bdev_name": "Malloc1p0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd2", 00:10:04.688 "bdev_name": "Malloc1p1" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd3", 00:10:04.688 "bdev_name": "Malloc2p0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd4", 00:10:04.688 "bdev_name": "Malloc2p1" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd5", 00:10:04.688 "bdev_name": "Malloc2p2" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd6", 00:10:04.688 "bdev_name": "Malloc2p3" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd7", 00:10:04.688 "bdev_name": "Malloc2p4" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd8", 00:10:04.688 "bdev_name": "Malloc2p5" 
00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd9", 00:10:04.688 "bdev_name": "Malloc2p6" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd10", 00:10:04.688 "bdev_name": "Malloc2p7" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd11", 00:10:04.688 "bdev_name": "TestPT" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd12", 00:10:04.688 "bdev_name": "raid0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd13", 00:10:04.688 "bdev_name": "concat0" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd14", 00:10:04.688 "bdev_name": "raid1" 00:10:04.688 }, 00:10:04.688 { 00:10:04.688 "nbd_device": "/dev/nbd15", 00:10:04.688 "bdev_name": "AIO0" 00:10:04.688 } 00:10:04.688 ]' 00:10:04.688 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:04.949 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- 
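Annotation: the JSON printed above by nbd_get_disks is turned into a plain device list with jq (nbd_common.sh@119), and the same parse is reused after tear-down to assert that no /dev/nbd entries remain (the grep -c / echo 0 sequence at nbd_common.sh@64-66 further down). A stand-alone equivalent of that parsing step, with paths taken from the trace; mapfile is used here instead of the traced array assignment and is an assumption of this sketch:

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
    SOCK="/var/tmp/spdk-nbd.sock"

    # Fetch the attached-device list and extract just the /dev/nbdN paths.
    nbd_disks_json=$("$RPC" -s "$SOCK" nbd_get_disks)
    mapfile -t nbd_disks_name < <(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')

    # After all disks are stopped the RPC returns '[]', so this count drops to 0.
    count=$(printf '%s\n' "${nbd_disks_name[@]}" | grep -c /dev/nbd || true)
    echo "attached nbd devices: $count"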
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:05.209 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:05.469 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:10:05.728 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:10:05.728 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:10:05.728 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:10:05.728 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:05.728 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:05.729 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:10:05.729 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:05.729 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:05.729 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:05.729 15:46:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 
-- # for i in "${nbd_list[@]}" 00:10:05.729 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:05.988 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:10:06.247 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:10:06.248 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:10:06.248 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:10:06.248 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:06.248 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:06.248 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:10:06.248 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:06.248 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:06.248 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:06.248 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- 
# waitfornbd_exit nbd8 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:06.507 15:46:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:06.767 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:07.027 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd11 /proc/partitions 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:10:07.287 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:07.547 15:46:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:07.806 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- 
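Annotation: the tear-down loop above stops each device and then waits for it to disappear via the waitfornbd_exit helper (nbd_common.sh@35-45). A sketch reconstructed from that trace; because set -x only records the commands that actually ran, the negated grep condition and the retry delay are inferred assumptions.

    # Counterpart of waitfornbd for tear-down: poll until the device name is
    # gone from /proc/partitions after nbd_stop_disk.
    waitfornbd_exit() {
        local nbd_name=$1
        local i
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions || break
            sleep 0.1   # assumed delay between polls
        done
        return 0
    }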
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:08.066 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' 
'/dev/nbd8' '/dev/nbd9') 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:08.326 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:10:08.326 /dev/nbd0 00:10:08.586 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:08.587 1+0 records in 00:10:08.587 1+0 records out 00:10:08.587 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260696 s, 15.7 MB/s 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:08.587 15:46:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:10:08.587 /dev/nbd1 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:08.587 1+0 records in 00:10:08.587 1+0 records out 00:10:08.587 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272549 s, 15.0 MB/s 00:10:08.587 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:08.846 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:08.846 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:08.846 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:08.846 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:10:08.847 /dev/nbd10 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 
)) 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:08.847 1+0 records in 00:10:08.847 1+0 records out 00:10:08.847 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304284 s, 13.5 MB/s 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:08.847 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:10:09.107 /dev/nbd11 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:09.107 1+0 records in 00:10:09.107 1+0 records out 00:10:09.107 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260337 s, 15.7 MB/s 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm 
-f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:09.107 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:10:09.367 /dev/nbd12 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:09.367 1+0 records in 00:10:09.367 1+0 records out 00:10:09.367 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000344874 s, 11.9 MB/s 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:09.367 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:10:09.628 /dev/nbd13 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:09.628 15:46:29 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:09.628 1+0 records in 00:10:09.628 1+0 records out 00:10:09.628 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028347 s, 14.4 MB/s 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:09.628 15:46:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:10:09.888 /dev/nbd14 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:09.888 1+0 records in 00:10:09.888 1+0 records out 00:10:09.888 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299968 s, 13.7 MB/s 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:09.888 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:10:10.148 /dev/nbd15 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:10.148 1+0 records in 00:10:10.148 1+0 records out 00:10:10.148 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319934 s, 12.8 MB/s 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:10.148 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:10.149 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:10.149 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:10:10.409 /dev/nbd2 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:10.409 15:46:30 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:10.409 1+0 records in 00:10:10.409 1+0 records out 00:10:10.409 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000301531 s, 13.6 MB/s 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:10.409 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:10:10.669 /dev/nbd3 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:10.669 1+0 records in 00:10:10.669 1+0 records out 00:10:10.669 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024538 s, 16.7 MB/s 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:10.669 15:46:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:10:10.669 /dev/nbd4 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:10.930 1+0 records in 00:10:10.930 1+0 records out 00:10:10.930 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000437788 s, 9.4 MB/s 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:10:10.930 /dev/nbd5 00:10:10.930 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:11.190 15:46:31 
blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:11.190 1+0 records in 00:10:11.190 1+0 records out 00:10:11.190 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000470212 s, 8.7 MB/s 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:11.190 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:10:11.191 /dev/nbd6 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:11.191 1+0 records in 00:10:11.191 1+0 records out 00:10:11.191 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000408948 s, 10.0 MB/s 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:11.191 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:10:11.451 /dev/nbd7 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:11.451 1+0 records in 00:10:11.451 1+0 records out 00:10:11.451 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000507338 s, 8.1 MB/s 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:11.451 15:46:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:10:11.711 /dev/nbd8 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:11.711 15:46:32 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:11.711 1+0 records in 00:10:11.711 1+0 records out 00:10:11.711 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000468663 s, 8.7 MB/s 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:11.711 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:11.712 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:11.712 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:11.712 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:10:11.971 /dev/nbd9 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:11.971 1+0 records in 00:10:11.971 1+0 records out 00:10:11.971 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000495179 s, 8.3 MB/s 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:11.971 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:12.230 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:12.230 { 00:10:12.230 "nbd_device": "/dev/nbd0", 00:10:12.230 "bdev_name": "Malloc0" 00:10:12.230 }, 00:10:12.230 { 00:10:12.230 "nbd_device": "/dev/nbd1", 00:10:12.230 "bdev_name": "Malloc1p0" 00:10:12.230 }, 00:10:12.230 { 00:10:12.230 "nbd_device": "/dev/nbd10", 00:10:12.230 "bdev_name": "Malloc1p1" 00:10:12.230 }, 00:10:12.230 { 00:10:12.230 "nbd_device": "/dev/nbd11", 00:10:12.230 "bdev_name": "Malloc2p0" 00:10:12.230 }, 00:10:12.230 { 00:10:12.230 "nbd_device": "/dev/nbd12", 00:10:12.230 "bdev_name": "Malloc2p1" 00:10:12.230 }, 00:10:12.230 { 00:10:12.230 "nbd_device": "/dev/nbd13", 00:10:12.230 "bdev_name": "Malloc2p2" 00:10:12.230 }, 00:10:12.230 { 00:10:12.231 "nbd_device": "/dev/nbd14", 00:10:12.231 "bdev_name": "Malloc2p3" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd15", 00:10:12.231 "bdev_name": "Malloc2p4" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd2", 00:10:12.231 "bdev_name": "Malloc2p5" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd3", 00:10:12.231 "bdev_name": "Malloc2p6" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd4", 00:10:12.231 "bdev_name": "Malloc2p7" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd5", 00:10:12.231 "bdev_name": "TestPT" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd6", 00:10:12.231 "bdev_name": "raid0" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd7", 00:10:12.231 "bdev_name": "concat0" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd8", 00:10:12.231 "bdev_name": "raid1" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd9", 00:10:12.231 "bdev_name": "AIO0" 00:10:12.231 } 00:10:12.231 ]' 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd0", 00:10:12.231 "bdev_name": "Malloc0" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd1", 00:10:12.231 "bdev_name": "Malloc1p0" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd10", 00:10:12.231 "bdev_name": "Malloc1p1" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd11", 00:10:12.231 "bdev_name": "Malloc2p0" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd12", 00:10:12.231 "bdev_name": "Malloc2p1" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd13", 00:10:12.231 "bdev_name": "Malloc2p2" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd14", 00:10:12.231 "bdev_name": "Malloc2p3" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd15", 
00:10:12.231 "bdev_name": "Malloc2p4" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd2", 00:10:12.231 "bdev_name": "Malloc2p5" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd3", 00:10:12.231 "bdev_name": "Malloc2p6" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd4", 00:10:12.231 "bdev_name": "Malloc2p7" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd5", 00:10:12.231 "bdev_name": "TestPT" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd6", 00:10:12.231 "bdev_name": "raid0" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd7", 00:10:12.231 "bdev_name": "concat0" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd8", 00:10:12.231 "bdev_name": "raid1" 00:10:12.231 }, 00:10:12.231 { 00:10:12.231 "nbd_device": "/dev/nbd9", 00:10:12.231 "bdev_name": "AIO0" 00:10:12.231 } 00:10:12.231 ]' 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:10:12.231 /dev/nbd1 00:10:12.231 /dev/nbd10 00:10:12.231 /dev/nbd11 00:10:12.231 /dev/nbd12 00:10:12.231 /dev/nbd13 00:10:12.231 /dev/nbd14 00:10:12.231 /dev/nbd15 00:10:12.231 /dev/nbd2 00:10:12.231 /dev/nbd3 00:10:12.231 /dev/nbd4 00:10:12.231 /dev/nbd5 00:10:12.231 /dev/nbd6 00:10:12.231 /dev/nbd7 00:10:12.231 /dev/nbd8 00:10:12.231 /dev/nbd9' 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:10:12.231 /dev/nbd1 00:10:12.231 /dev/nbd10 00:10:12.231 /dev/nbd11 00:10:12.231 /dev/nbd12 00:10:12.231 /dev/nbd13 00:10:12.231 /dev/nbd14 00:10:12.231 /dev/nbd15 00:10:12.231 /dev/nbd2 00:10:12.231 /dev/nbd3 00:10:12.231 /dev/nbd4 00:10:12.231 /dev/nbd5 00:10:12.231 /dev/nbd6 00:10:12.231 /dev/nbd7 00:10:12.231 /dev/nbd8 00:10:12.231 /dev/nbd9' 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:10:12.231 256+0 records in 00:10:12.231 256+0 records out 00:10:12.231 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.012558 s, 83.5 MB/s 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:10:12.231 256+0 records in 00:10:12.231 256+0 records out 00:10:12.231 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0873736 s, 12.0 MB/s 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:12.231 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:10:12.491 256+0 records in 00:10:12.491 256+0 records out 00:10:12.491 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0817552 s, 12.8 MB/s 00:10:12.491 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:12.491 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:10:12.491 256+0 records in 00:10:12.491 256+0 records out 00:10:12.491 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0869989 s, 12.1 MB/s 00:10:12.491 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:12.491 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:10:12.751 256+0 records in 00:10:12.751 256+0 records out 00:10:12.751 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0871507 s, 12.0 MB/s 00:10:12.751 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:12.751 15:46:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:10:12.751 256+0 records in 00:10:12.751 256+0 records out 00:10:12.751 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0864944 s, 12.1 MB/s 00:10:12.751 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:12.751 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:10:12.751 256+0 records in 00:10:12.751 256+0 records out 00:10:12.751 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0861624 s, 12.2 MB/s 00:10:12.751 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:12.751 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:10:13.010 256+0 records in 00:10:13.010 256+0 records out 00:10:13.010 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0854439 s, 12.3 MB/s 00:10:13.010 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:13.010 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:10:13.010 256+0 records in 00:10:13.010 256+0 records out 00:10:13.010 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0855957 s, 12.3 MB/s 00:10:13.010 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:13.010 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:10:13.010 256+0 records in 00:10:13.010 256+0 records out 00:10:13.010 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0854585 s, 12.3 MB/s 00:10:13.010 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:13.010 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:10:13.269 256+0 records in 00:10:13.269 256+0 records out 00:10:13.269 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.08906 s, 11.8 MB/s 00:10:13.269 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:13.269 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:10:13.269 256+0 records in 00:10:13.269 256+0 records out 00:10:13.269 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0871228 s, 12.0 MB/s 00:10:13.269 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:13.269 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:10:13.269 256+0 records in 00:10:13.269 256+0 records out 00:10:13.269 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0878746 s, 11.9 MB/s 00:10:13.269 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:13.269 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:10:13.530 256+0 records in 00:10:13.530 256+0 records out 00:10:13.530 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0895884 s, 11.7 MB/s 00:10:13.530 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:13.530 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:10:13.530 256+0 records in 00:10:13.530 256+0 records out 00:10:13.530 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0925109 s, 11.3 MB/s 00:10:13.530 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:13.530 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:10:13.530 256+0 records in 00:10:13.530 256+0 records out 00:10:13.530 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0895978 s, 11.7 MB/s 00:10:13.530 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:13.530 15:46:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:10:13.791 256+0 records in 00:10:13.791 256+0 records out 00:10:13.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0962684 s, 10.9 MB/s 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:13.791 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:14.051 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:14.311 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:14.311 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:14.311 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:14.312 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:14.312 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:14.312 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:14.312 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:14.312 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:14.312 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:14.312 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:10:14.572 15:46:34 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:14.572 15:46:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:14.832 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:15.093 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:15.354 15:46:35 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:15.354 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:15.614 15:46:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:16.186 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:10:16.447 15:46:36 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:10:16.447 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:10:16.447 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:10:16.447 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:16.447 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:16.447 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:10:16.447 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:16.447 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:16.447 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:16.447 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:16.708 15:46:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.280 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.540 15:46:37 
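[annotation] The teardown traced above repeats the same two-step pattern for every NBD device: an nbd_stop_disk RPC followed by polling /proc/partitions until the device name disappears. The following is a minimal bash sketch reconstructed from the traced nbd_common.sh line numbers, not the verbatim SPDK helper; the sleep pacing between polls is an assumption (xtrace only shows the grep and the break), and the argument handling of nbd_stop_disks is simplified.

    # Illustrative reconstruction of the traced stop/wait pattern (assumptions noted above).
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    waitfornbd_exit() {
        local nbd_name=$1
        local i
        # Poll up to 20 times until the device drops out of /proc/partitions.
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions || break
            sleep 0.1   # assumed pacing; the trace only shows the grep and the break
        done
        return 0
    }

    nbd_stop_disks() {
        local rpc_server=$1
        shift
        local nbd_list=("$@")
        local i
        for i in "${nbd_list[@]}"; do
            "$rpc_py" -s "$rpc_server" nbd_stop_disk "$i"    # ask the SPDK app to stop the export
            waitfornbd_exit "$(basename "$i")"               # then wait for the kernel device to vanish
        done
    }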
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.540 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.541 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:10:17.541 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.541 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.541 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.541 15:46:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:17.801 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:10:18.081 15:46:38 
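[annotation] The nbd_get_count check traced here asks the RPC server which NBD disks are still exported and counts them; after the stop loop the test expects zero. A hedged sketch of that pipeline follows, reconstructed from the traced commands; the function and variable names mirror the trace, but the exact quoting and error handling of the real helper may differ.

    # Illustrative sketch of the traced count check.
    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

    nbd_get_count() {
        local rpc_server=$1
        local nbd_disks_json nbd_disks_name count
        nbd_disks_json=$("$rpc_py" -s "$rpc_server" nbd_get_disks)            # '[]' once everything is stopped
        nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')  # empty array -> empty string
        count=$(echo "$nbd_disks_name" | grep -c /dev/nbd || true)            # grep -c prints 0 and exits 1 on no match
        echo "$count"
    }

    count=$(nbd_get_count /var/tmp/spdk-nbd.sock)
    [ "$count" -ne 0 ] && exit 1   # the trace shows this comparison resolving to 0 -ne 0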
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:10:18.081 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:10:18.390 malloc_lvol_verify 00:10:18.390 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:10:18.651 c7c51932-efb6-4f33-816c-42c2bd1174b6 00:10:18.651 15:46:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:10:18.651 986809ff-8aa7-47e1-91e4-fd10ac32eee3 00:10:18.911 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:10:18.911 /dev/nbd0 00:10:18.911 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:10:18.911 mke2fs 1.46.5 (30-Dec-2021) 00:10:18.911 Discarding device blocks: 0/4096 done 00:10:18.911 Creating filesystem with 4096 1k blocks and 1024 inodes 00:10:18.911 00:10:18.911 Allocating group tables: 0/1 done 00:10:18.911 Writing inode tables: 0/1 done 00:10:18.911 Creating journal (1024 blocks): done 00:10:18.911 Writing superblocks and filesystem accounting information: 0/1 done 00:10:18.911 00:10:18.911 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:10:18.911 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:10:18.911 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:18.911 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:18.911 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:18.911 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:18.911 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:18.911 15:46:39 
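[annotation] The nbd_with_lvol_verify step traced above builds a small logical-volume stack on a malloc bdev, exports it over NBD and proves the export takes real I/O by formatting it with mkfs.ext4. A condensed sketch of that RPC sequence is below; the names and sizes are copied from the traced command lines, while the error handling and the final comparison are simplified.

    # Illustrative sketch of the traced lvol round-trip (not the verbatim helper).
    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

    $rpc bdev_malloc_create -b malloc_lvol_verify 16 512   # malloc bdev, 512-byte blocks, as on the traced line
    $rpc bdev_lvol_create_lvstore malloc_lvol_verify lvs   # lvstore on top of the malloc bdev
    $rpc bdev_lvol_create lvol 4 -l lvs                    # small lvol carved out of the store
    $rpc nbd_start_disk lvs/lvol /dev/nbd0                 # export the lvol as /dev/nbd0

    mkfs.ext4 /dev/nbd0                                    # prove the exported device accepts writes
    mkfs_ret=$?

    $rpc nbd_stop_disk /dev/nbd0                           # tear the export down again
    [ "$mkfs_ret" -ne 0 ] && exit 1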
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2477408 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2477408 ']' 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2477408 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2477408 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2477408' 00:10:19.172 killing process with pid 2477408 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2477408 00:10:19.172 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2477408 00:10:19.433 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:10:19.433 00:10:19.433 real 0m19.438s 00:10:19.433 user 0m27.301s 00:10:19.433 sys 0m7.977s 00:10:19.433 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:19.433 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:10:19.433 ************************************ 00:10:19.433 END TEST bdev_nbd 00:10:19.433 ************************************ 00:10:19.433 15:46:39 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:19.433 15:46:39 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:10:19.433 15:46:39 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:10:19.433 15:46:39 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:10:19.433 15:46:39 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:10:19.433 15:46:39 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:19.433 15:46:39 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:19.433 15:46:39 blockdev_general -- common/autotest_common.sh@10 -- 
# set +x 00:10:19.694 ************************************ 00:10:19.694 START TEST bdev_fio 00:10:19.694 ************************************ 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:19.694 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # 
echo '[job_Malloc1p0]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:10:19.694 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # 
for b in "${bdevs_name[@]}" 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:19.695 15:46:39 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:19.695 ************************************ 00:10:19.695 START TEST bdev_fio_rw_verify 00:10:19.695 ************************************ 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:19.695 15:46:40 
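[annotation] The loop traced above appends one fio job section per bdev to test/bdev/bdev.fio before fio is launched. The sketch below reconstructs that generation step: the bdev names, the [job_*]/filename= echoes and the serialize_overlap=1 line are taken directly from the trace, while the append redirection into bdev.fio is implied by fio_config_gen and is an assumption here.

    # Illustrative reconstruction of the per-bdev job generation (redirection assumed).
    fio_cfg=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
    bdevs_name=(Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 \
                Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0)

    echo "serialize_overlap=1" >> "$fio_cfg"   # emitted because fio --version reported fio-3.35
    for b in "${bdevs_name[@]}"; do
        echo "[job_$b]"    >> "$fio_cfg"       # one fio job section per bdev ...
        echo "filename=$b" >> "$fio_cfg"       # ... targeted by its SPDK bdev name
    done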
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:10:19.695 15:46:40 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:20.263 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.263 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.263 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.263 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 
4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:20.264 fio-3.35 00:10:20.264 Starting 16 threads 00:10:32.491 00:10:32.491 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=2481648: Fri Jul 12 15:46:51 2024 00:10:32.491 read: IOPS=103k, BW=404MiB/s (424MB/s)(4040MiB/10001msec) 00:10:32.491 slat (usec): min=2, max=442, avg=28.98, stdev=20.39 00:10:32.491 clat (usec): min=5, max=1629, avg=250.39, stdev=156.75 00:10:32.491 lat (usec): min=9, max=1741, avg=279.37, stdev=168.40 00:10:32.491 clat percentiles (usec): 00:10:32.491 | 50.000th=[ 239], 99.000th=[ 717], 99.900th=[ 955], 99.990th=[ 1221], 00:10:32.491 | 99.999th=[ 1532] 00:10:32.491 write: IOPS=161k, BW=629MiB/s (659MB/s)(6225MiB/9900msec); 0 zone resets 00:10:32.491 slat (usec): min=3, max=3855, avg=42.94, stdev=24.54 00:10:32.491 clat (usec): min=6, max=2787, avg=308.60, stdev=187.65 00:10:32.491 lat (usec): min=18, max=4488, avg=351.53, stdev=202.49 00:10:32.491 clat percentiles (usec): 00:10:32.491 | 50.000th=[ 289], 99.000th=[ 938], 99.900th=[ 1434], 99.990th=[ 1614], 00:10:32.491 | 99.999th=[ 1745] 00:10:32.491 bw ( KiB/s): min=464128, max=766576, per=98.61%, avg=634890.84, stdev=6689.17, samples=304 00:10:32.491 iops : min=116032, max=191644, avg=158722.53, stdev=1672.31, samples=304 00:10:32.491 lat (usec) : 10=0.01%, 20=0.30%, 50=4.00%, 100=10.69%, 250=31.26% 00:10:32.491 lat (usec) : 500=43.83%, 750=8.16%, 1000=1.26% 00:10:32.491 lat (msec) : 2=0.50%, 4=0.01% 00:10:32.491 cpu : usr=99.33%, sys=0.27%, ctx=606, majf=0, minf=2660 00:10:32.491 IO depths : 1=12.4%, 2=24.8%, 4=50.2%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:32.491 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:32.491 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:32.491 issued rwts: total=1034260,1593488,0,0 short=0,0,0,0 dropped=0,0,0,0 00:10:32.491 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:32.491 00:10:32.491 Run status group 0 (all jobs): 00:10:32.491 READ: bw=404MiB/s (424MB/s), 404MiB/s-404MiB/s (424MB/s-424MB/s), io=4040MiB (4236MB), run=10001-10001msec 00:10:32.491 WRITE: bw=629MiB/s (659MB/s), 629MiB/s-629MiB/s (659MB/s-659MB/s), io=6225MiB (6527MB), run=9900-9900msec 00:10:32.491 00:10:32.491 real 0m11.830s 00:10:32.491 user 2m48.049s 00:10:32.491 sys 0m1.647s 00:10:32.491 15:46:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:32.491 15:46:51 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:10:32.491 ************************************ 
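[annotation] The 16-thread run summarized above is plain upstream fio driven through SPDK's fio bdev plugin: the job targets come from the generated bdev.fio and the SPDK application config from bdev.json. Stripped of the empty sanitizer LD_PRELOAD entries, the traced invocation reduces to roughly the following; paths are shortened into a variable for readability but the flags are exactly those shown in the trace.

    # Condensed form of the traced fio_bdev invocation.
    spdk=/var/jenkins/workspace/crypto-phy-autotest/spdk

    LD_PRELOAD="$spdk/build/fio/spdk_bdev" /usr/src/fio/fio \
        --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
        --verify_state_save=0 \
        --spdk_json_conf="$spdk/test/bdev/bdev.json" \
        --spdk_mem=0 \
        --aux-path="$spdk/../output" \
        "$spdk/test/bdev/bdev.fio"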
00:10:32.491 END TEST bdev_fio_rw_verify 00:10:32.491 ************************************ 00:10:32.491 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:10:32.491 15:46:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:10:32.492 15:46:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:10:32.493 15:46:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "df20144b-3412-42df-b0e3-a6af0faea199"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "df20144b-3412-42df-b0e3-a6af0faea199",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "841eae4f-f681-5dc2-80e4-fbc2f0926fd8"' ' ],' ' "product_name": "Split 
Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "841eae4f-f681-5dc2-80e4-fbc2f0926fd8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "9dfed297-4dfd-5dd2-8b01-7b41830cc24a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "9dfed297-4dfd-5dd2-8b01-7b41830cc24a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "909dd3f0-2477-52f2-8ae6-818f33597236"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "909dd3f0-2477-52f2-8ae6-818f33597236",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "8b94ca3b-6e2f-5242-a04b-b5ed93635d9c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8b94ca3b-6e2f-5242-a04b-b5ed93635d9c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": 
false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "b3f9ac83-d266-578c-b07a-2de2df9b6868"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b3f9ac83-d266-578c-b07a-2de2df9b6868",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d73479a9-a4d1-5868-8818-6107f497c773"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d73479a9-a4d1-5868-8818-6107f497c773",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "11324ce1-5b36-5e09-b448-f90225da1620"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "11324ce1-5b36-5e09-b448-f90225da1620",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "89de5192-fccc-5200-a3b2-8989327a8d2f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "89de5192-fccc-5200-a3b2-8989327a8d2f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "7f2fcd30-1715-5317-ac5b-3cba6f0ad3f0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7f2fcd30-1715-5317-ac5b-3cba6f0ad3f0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "13ca2c17-e4bb-518d-bfd9-0d303997b38f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "13ca2c17-e4bb-518d-bfd9-0d303997b38f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "0ebec7f2-d311-523c-a6da-89849c326a8f"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0ebec7f2-d311-523c-a6da-89849c326a8f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' 
}' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "2a61e231-61ec-4c2f-aafe-c0fd5b1f102d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2a61e231-61ec-4c2f-aafe-c0fd5b1f102d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2a61e231-61ec-4c2f-aafe-c0fd5b1f102d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "e84df681-2a24-4f3e-a4fa-c4dd1c22c930",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "e13c88a1-1488-4044-aace-f91811607984",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "fc3cae4a-85a6-45d0-8cef-4e59d486842c"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "fc3cae4a-85a6-45d0-8cef-4e59d486842c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "fc3cae4a-85a6-45d0-8cef-4e59d486842c",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "27e4fad7-48be-4bd4-9bc2-d5e03ee46a61",' ' "is_configured": true,' ' "data_offset": 0,' 
' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "22a4b234-e34c-4e10-bb04-a0a979a5a1d6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "e16fa3a4-852b-471a-b21a-6117bfe5373a"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e16fa3a4-852b-471a-b21a-6117bfe5373a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "e16fa3a4-852b-471a-b21a-6117bfe5373a",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "ad0cc246-e7c3-40b7-a94b-ecd40f190cb6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "70e7f65d-f9cc-4b17-a156-fbecc61f66dd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "cc549ac3-5341-485d-944a-6a87e887dda7"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "cc549ac3-5341-485d-944a-6a87e887dda7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:32.493 15:46:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:10:32.493 Malloc1p0 00:10:32.493 Malloc1p1 00:10:32.493 Malloc2p0 00:10:32.493 Malloc2p1 00:10:32.493 Malloc2p2 00:10:32.493 Malloc2p3 00:10:32.493 Malloc2p4 00:10:32.493 Malloc2p5 00:10:32.493 Malloc2p6 00:10:32.493 Malloc2p7 00:10:32.493 TestPT 00:10:32.493 raid0 00:10:32.493 concat0 ]] 00:10:32.493 15:46:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # 
jq -r 'select(.supported_io_types.unmap == true) | .name' 00:10:32.494 15:46:51 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "df20144b-3412-42df-b0e3-a6af0faea199"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "df20144b-3412-42df-b0e3-a6af0faea199",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "841eae4f-f681-5dc2-80e4-fbc2f0926fd8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "841eae4f-f681-5dc2-80e4-fbc2f0926fd8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "9dfed297-4dfd-5dd2-8b01-7b41830cc24a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "9dfed297-4dfd-5dd2-8b01-7b41830cc24a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "909dd3f0-2477-52f2-8ae6-818f33597236"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "909dd3f0-2477-52f2-8ae6-818f33597236",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' 
"zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "8b94ca3b-6e2f-5242-a04b-b5ed93635d9c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8b94ca3b-6e2f-5242-a04b-b5ed93635d9c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "b3f9ac83-d266-578c-b07a-2de2df9b6868"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b3f9ac83-d266-578c-b07a-2de2df9b6868",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "d73479a9-a4d1-5868-8818-6107f497c773"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d73479a9-a4d1-5868-8818-6107f497c773",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": 
[' ' "11324ce1-5b36-5e09-b448-f90225da1620"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "11324ce1-5b36-5e09-b448-f90225da1620",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "89de5192-fccc-5200-a3b2-8989327a8d2f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "89de5192-fccc-5200-a3b2-8989327a8d2f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "7f2fcd30-1715-5317-ac5b-3cba6f0ad3f0"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7f2fcd30-1715-5317-ac5b-3cba6f0ad3f0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "13ca2c17-e4bb-518d-bfd9-0d303997b38f"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "13ca2c17-e4bb-518d-bfd9-0d303997b38f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "0ebec7f2-d311-523c-a6da-89849c326a8f"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "0ebec7f2-d311-523c-a6da-89849c326a8f",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "2a61e231-61ec-4c2f-aafe-c0fd5b1f102d"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "2a61e231-61ec-4c2f-aafe-c0fd5b1f102d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "2a61e231-61ec-4c2f-aafe-c0fd5b1f102d",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "e84df681-2a24-4f3e-a4fa-c4dd1c22c930",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "e13c88a1-1488-4044-aace-f91811607984",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "fc3cae4a-85a6-45d0-8cef-4e59d486842c"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "fc3cae4a-85a6-45d0-8cef-4e59d486842c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 
0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "fc3cae4a-85a6-45d0-8cef-4e59d486842c",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "27e4fad7-48be-4bd4-9bc2-d5e03ee46a61",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "22a4b234-e34c-4e10-bb04-a0a979a5a1d6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "e16fa3a4-852b-471a-b21a-6117bfe5373a"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "e16fa3a4-852b-471a-b21a-6117bfe5373a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "e16fa3a4-852b-471a-b21a-6117bfe5373a",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "ad0cc246-e7c3-40b7-a94b-ecd40f190cb6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "70e7f65d-f9cc-4b17-a156-fbecc61f66dd",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "cc549ac3-5341-485d-944a-6a87e887dda7"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": 
"cc549ac3-5341-485d-944a-6a87e887dda7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 
00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:32.495 15:46:52 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:32.495 ************************************ 00:10:32.495 START TEST bdev_fio_trim 00:10:32.495 ************************************ 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:10:32.495 15:46:52 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k 
--runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:32.495 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:32.495 fio-3.35 00:10:32.495 Starting 14 threads 00:10:44.750 00:10:44.750 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=2483752: Fri Jul 12 15:47:03 2024 00:10:44.750 write: IOPS=148k, BW=578MiB/s (606MB/s)(5785MiB/10002msec); 0 zone resets 00:10:44.750 slat (usec): min=2, max=2119, avg=32.52, stdev=17.18 00:10:44.751 clat (usec): min=10, max=2327, avg=242.25, stdev=112.68 00:10:44.751 lat (usec): min=19, max=2345, avg=274.77, stdev=121.48 00:10:44.751 clat percentiles (usec): 00:10:44.751 | 50.000th=[ 223], 99.000th=[ 660], 99.900th=[ 832], 99.990th=[ 938], 00:10:44.751 | 99.999th=[ 1123] 00:10:44.751 bw ( KiB/s): min=457856, max=798723, per=100.00%, avg=593831.74, stdev=8065.33, samples=266 00:10:44.751 iops : min=114464, max=199679, avg=148457.84, stdev=2016.32, samples=266 00:10:44.751 trim: IOPS=148k, BW=578MiB/s (606MB/s)(5785MiB/10002msec); 0 zone resets 00:10:44.751 slat (usec): min=3, max=686, avg=21.73, stdev=11.47 00:10:44.751 clat (usec): min=3, max=2345, avg=271.02, stdev=124.28 00:10:44.751 lat (usec): min=10, max=2363, avg=292.74, stdev=130.87 00:10:44.751 clat percentiles (usec): 00:10:44.751 | 50.000th=[ 251], 99.000th=[ 734], 99.900th=[ 906], 99.990th=[ 1020], 00:10:44.751 | 99.999th=[ 1106] 00:10:44.751 bw ( KiB/s): min=457856, max=798723, per=100.00%, avg=593832.16, stdev=8065.36, samples=266 00:10:44.751 iops : min=114464, max=199679, avg=148457.95, stdev=2016.33, 
samples=266 00:10:44.751 lat (usec) : 4=0.01%, 10=0.02%, 20=0.04%, 50=0.37%, 100=3.37% 00:10:44.751 lat (usec) : 250=51.45%, 500=40.58%, 750=3.59%, 1000=0.58% 00:10:44.751 lat (msec) : 2=0.01%, 4=0.01% 00:10:44.751 cpu : usr=99.66%, sys=0.00%, ctx=591, majf=0, minf=1063 00:10:44.751 IO depths : 1=12.5%, 2=24.9%, 4=50.0%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:44.751 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:44.751 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:44.751 issued rwts: total=0,1480994,1480995,0 short=0,0,0,0 dropped=0,0,0,0 00:10:44.751 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:44.751 00:10:44.751 Run status group 0 (all jobs): 00:10:44.751 WRITE: bw=578MiB/s (606MB/s), 578MiB/s-578MiB/s (606MB/s-606MB/s), io=5785MiB (6066MB), run=10002-10002msec 00:10:44.751 TRIM: bw=578MiB/s (606MB/s), 578MiB/s-578MiB/s (606MB/s-606MB/s), io=5785MiB (6066MB), run=10002-10002msec 00:10:44.751 00:10:44.751 real 0m11.246s 00:10:44.751 user 2m31.842s 00:10:44.751 sys 0m0.722s 00:10:44.751 15:47:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:44.751 15:47:03 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:10:44.751 ************************************ 00:10:44.751 END TEST bdev_fio_trim 00:10:44.751 ************************************ 00:10:44.751 15:47:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:10:44.751 15:47:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:10:44.751 15:47:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:44.751 15:47:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:10:44.751 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:44.751 15:47:03 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:10:44.751 00:10:44.751 real 0m23.437s 00:10:44.751 user 5m20.092s 00:10:44.751 sys 0m2.554s 00:10:44.751 15:47:03 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:44.751 15:47:03 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:44.751 ************************************ 00:10:44.751 END TEST bdev_fio 00:10:44.751 ************************************ 00:10:44.751 15:47:03 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:44.751 15:47:03 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:10:44.751 15:47:03 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:10:44.751 15:47:03 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:10:44.751 15:47:03 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:44.751 15:47:03 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:44.751 ************************************ 00:10:44.751 START TEST bdev_verify 00:10:44.751 ************************************ 00:10:44.751 15:47:03 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 
4096 -w verify -t 5 -C -m 0x3 '' 00:10:44.751 [2024-07-12 15:47:03.486332] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:10:44.751 [2024-07-12 15:47:03.486378] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2485622 ] 00:10:44.751 [2024-07-12 15:47:03.572178] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:44.751 [2024-07-12 15:47:03.636130] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:44.751 [2024-07-12 15:47:03.636135] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:44.751 [2024-07-12 15:47:03.757262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:44.751 [2024-07-12 15:47:03.757305] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:44.751 [2024-07-12 15:47:03.757314] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:44.751 [2024-07-12 15:47:03.765271] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:44.751 [2024-07-12 15:47:03.765290] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:44.751 [2024-07-12 15:47:03.773281] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:44.751 [2024-07-12 15:47:03.773297] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:44.751 [2024-07-12 15:47:03.834319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:44.751 [2024-07-12 15:47:03.834357] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:44.751 [2024-07-12 15:47:03.834365] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeb3ab0 00:10:44.751 [2024-07-12 15:47:03.834372] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:44.751 [2024-07-12 15:47:03.835655] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:44.751 [2024-07-12 15:47:03.835674] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:44.751 Running I/O for 5 seconds... 
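The bdev_verify stage launched just above drives every bdev in bdev.json with a 4 KiB verify workload from two cores. A minimal standalone sketch of that invocation, assuming the workspace layout shown in the log (flag readings: -q queue depth, -o I/O size in bytes, -w workload, -t run time in seconds, -m core mask; -C is passed through exactly as the test script passes it):

# sketch only; paths assumed from this workspace
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3   # core mask 0x3 = cores 0 and 1, matching the two reactors above
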
00:10:48.955 00:10:48.955 Latency(us) 00:10:48.955 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:48.955 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x1000 00:10:48.955 Malloc0 : 5.14 1393.33 5.44 0.00 0.00 91677.85 415.90 325865.16 00:10:48.955 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x1000 length 0x1000 00:10:48.955 Malloc0 : 5.18 1162.34 4.54 0.00 0.00 109890.63 510.42 354902.65 00:10:48.955 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x800 00:10:48.955 Malloc1p0 : 5.15 721.30 2.82 0.00 0.00 176640.27 1978.68 169385.35 00:10:48.955 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x800 length 0x800 00:10:48.955 Malloc1p0 : 5.21 613.98 2.40 0.00 0.00 207419.24 2419.79 189550.28 00:10:48.955 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x800 00:10:48.955 Malloc1p1 : 5.15 721.06 2.82 0.00 0.00 176326.50 1877.86 160512.79 00:10:48.955 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x800 length 0x800 00:10:48.955 Malloc1p1 : 5.21 613.74 2.40 0.00 0.00 207005.33 2293.76 191163.47 00:10:48.955 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x200 00:10:48.955 Malloc2p0 : 5.15 720.83 2.82 0.00 0.00 176052.84 1890.46 155673.21 00:10:48.955 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x200 length 0x200 00:10:48.955 Malloc2p0 : 5.22 613.50 2.40 0.00 0.00 206669.00 2318.97 191970.07 00:10:48.955 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x200 00:10:48.955 Malloc2p1 : 5.15 720.59 2.81 0.00 0.00 175769.03 1890.46 156479.80 00:10:48.955 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x200 length 0x200 00:10:48.955 Malloc2p1 : 5.22 613.26 2.40 0.00 0.00 206316.40 2470.20 192776.66 00:10:48.955 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x200 00:10:48.955 Malloc2p2 : 5.15 720.35 2.81 0.00 0.00 175493.91 2268.55 157286.40 00:10:48.955 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x200 length 0x200 00:10:48.955 Malloc2p2 : 5.22 613.02 2.39 0.00 0.00 205880.11 2646.65 191970.07 00:10:48.955 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x200 00:10:48.955 Malloc2p3 : 5.15 720.12 2.81 0.00 0.00 175162.43 2230.74 155673.21 00:10:48.955 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x200 length 0x200 00:10:48.955 Malloc2p3 : 5.22 612.78 2.39 0.00 0.00 205340.18 2545.82 187937.08 00:10:48.955 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x200 00:10:48.955 Malloc2p4 : 5.16 719.88 2.81 0.00 0.00 174768.42 2129.92 
153253.42 00:10:48.955 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x200 length 0x200 00:10:48.955 Malloc2p4 : 5.22 612.54 2.39 0.00 0.00 204831.77 2432.39 186323.89 00:10:48.955 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x200 00:10:48.955 Malloc2p5 : 5.16 719.64 2.81 0.00 0.00 174393.55 1978.68 152446.82 00:10:48.955 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x200 length 0x200 00:10:48.955 Malloc2p5 : 5.23 612.30 2.39 0.00 0.00 204429.55 2293.76 187937.08 00:10:48.955 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x200 00:10:48.955 Malloc2p6 : 5.16 719.41 2.81 0.00 0.00 174086.06 1852.65 154060.01 00:10:48.955 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x200 length 0x200 00:10:48.955 Malloc2p6 : 5.23 612.07 2.39 0.00 0.00 204088.74 2230.74 184710.70 00:10:48.955 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x200 00:10:48.955 Malloc2p7 : 5.16 719.18 2.81 0.00 0.00 173813.64 1890.46 156479.80 00:10:48.955 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x200 length 0x200 00:10:48.955 Malloc2p7 : 5.23 611.83 2.39 0.00 0.00 203744.75 2281.16 184710.70 00:10:48.955 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.955 Verification LBA range: start 0x0 length 0x1000 00:10:48.956 TestPT : 5.18 715.94 2.80 0.00 0.00 174123.17 8670.92 154866.61 00:10:48.956 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.956 Verification LBA range: start 0x1000 length 0x1000 00:10:48.956 TestPT : 5.20 590.62 2.31 0.00 0.00 210538.80 52832.10 185517.29 00:10:48.956 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.956 Verification LBA range: start 0x0 length 0x2000 00:10:48.956 raid0 : 5.16 718.70 2.81 0.00 0.00 173223.63 1928.27 153253.42 00:10:48.956 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.956 Verification LBA range: start 0x2000 length 0x2000 00:10:48.956 raid0 : 5.23 611.58 2.39 0.00 0.00 202895.10 2860.90 170191.95 00:10:48.956 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.956 Verification LBA range: start 0x0 length 0x2000 00:10:48.956 concat0 : 5.19 739.50 2.89 0.00 0.00 168087.01 1915.67 155673.21 00:10:48.956 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.956 Verification LBA range: start 0x2000 length 0x2000 00:10:48.956 concat0 : 5.23 611.35 2.39 0.00 0.00 202427.87 2520.62 166965.56 00:10:48.956 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.956 Verification LBA range: start 0x0 length 0x1000 00:10:48.956 raid1 : 5.19 739.25 2.89 0.00 0.00 167812.25 2848.30 159706.19 00:10:48.956 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.956 Verification LBA range: start 0x1000 length 0x1000 00:10:48.956 raid1 : 5.24 611.10 2.39 0.00 0.00 201930.52 3037.34 175031.53 00:10:48.956 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:10:48.956 Verification LBA range: start 0x0 
length 0x4e2 00:10:48.956 AIO0 : 5.20 739.06 2.89 0.00 0.00 167420.86 907.42 169385.35 00:10:48.956 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:10:48.956 Verification LBA range: start 0x4e2 length 0x4e2 00:10:48.956 AIO0 : 5.24 610.92 2.39 0.00 0.00 201410.85 1121.67 181484.31 00:10:48.956 =================================================================================================================== 00:10:48.956 Total : 22575.05 88.18 0.00 0.00 178094.91 415.90 354902.65 00:10:49.216 00:10:49.216 real 0m6.104s 00:10:49.216 user 0m11.576s 00:10:49.216 sys 0m0.246s 00:10:49.216 15:47:09 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:49.216 15:47:09 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:10:49.216 ************************************ 00:10:49.216 END TEST bdev_verify 00:10:49.216 ************************************ 00:10:49.216 15:47:09 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:49.216 15:47:09 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:49.216 15:47:09 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:10:49.216 15:47:09 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:49.216 15:47:09 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:49.216 ************************************ 00:10:49.216 START TEST bdev_verify_big_io 00:10:49.216 ************************************ 00:10:49.216 15:47:09 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:10:49.477 [2024-07-12 15:47:09.695788] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
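The bdev_verify_big_io pass that starts here repeats the verify workload with 64 KiB I/Os (-o 65536). The queue-depth warnings that follow are expected for the small bdevs: with 128 outstanding 64 KiB requests requested, a split like Malloc2p0 (8192 blocks * 512 B = 4 MiB) is capped at 32, which matches 4 MiB / (2 * 64 KiB) = 32, and the same arithmetic gives 78 for the 2048 B * 5000-block AIO0 file; the factor of two is inferred from the numbers in this log, not from bdevperf documentation. A sketch of the invocation, under the same path assumptions as the verify sketch above:

# sketch only; paths assumed from this workspace
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
"$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
    -q 128 -o 65536 -w verify -t 5 -C -m 0x3   # same flags as bdev_verify, 64 KiB I/O size
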
00:10:49.477 [2024-07-12 15:47:09.695925] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2486577 ] 00:10:49.477 [2024-07-12 15:47:09.838991] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:49.477 [2024-07-12 15:47:09.916940] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:49.477 [2024-07-12 15:47:09.917058] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:49.738 [2024-07-12 15:47:10.049646] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:49.738 [2024-07-12 15:47:10.049686] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:49.738 [2024-07-12 15:47:10.049694] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:49.738 [2024-07-12 15:47:10.057654] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:49.738 [2024-07-12 15:47:10.057672] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:49.738 [2024-07-12 15:47:10.065667] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:49.738 [2024-07-12 15:47:10.065684] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:49.738 [2024-07-12 15:47:10.127163] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:49.738 [2024-07-12 15:47:10.127200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:49.738 [2024-07-12 15:47:10.127209] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x276eab0 00:10:49.738 [2024-07-12 15:47:10.127216] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:49.738 [2024-07-12 15:47:10.128518] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:49.738 [2024-07-12 15:47:10.128538] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:49.999 [2024-07-12 15:47:10.277255] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.278082] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.279237] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.280061] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). 
Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.281217] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.282019] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.283180] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.284337] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.285148] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.286298] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.287110] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.288261] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.289066] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.290223] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.291050] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.292208] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). 
Queue depth is limited to 32 00:10:49.999 [2024-07-12 15:47:10.307887] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:10:49.999 [2024-07-12 15:47:10.309122] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:10:49.999 Running I/O for 5 seconds... 00:10:58.196 00:10:58.196 Latency(us) 00:10:58.196 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:58.196 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x100 00:10:58.196 Malloc0 : 6.11 146.74 9.17 0.00 0.00 855190.81 721.53 2400432.44 00:10:58.196 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x100 length 0x100 00:10:58.196 Malloc0 : 6.10 125.80 7.86 0.00 0.00 994608.50 901.12 2723071.21 00:10:58.196 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x80 00:10:58.196 Malloc1p0 : 6.39 85.78 5.36 0.00 0.00 1372885.24 1840.05 2787598.97 00:10:58.196 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x80 length 0x80 00:10:58.196 Malloc1p0 : 7.08 31.63 1.98 0.00 0.00 3629348.75 1449.35 5936553.35 00:10:58.196 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x80 00:10:58.196 Malloc1p1 : 6.69 35.85 2.24 0.00 0.00 3154734.83 1222.50 5497764.63 00:10:58.196 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x80 length 0x80 00:10:58.196 Malloc1p1 : 7.08 31.62 1.98 0.00 0.00 3488316.86 1531.27 5678442.34 00:10:58.196 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x20 00:10:58.196 Malloc2p0 : 6.28 22.92 1.43 0.00 0.00 1229783.49 485.22 2051982.57 00:10:58.196 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x20 length 0x20 00:10:58.196 Malloc2p0 : 6.54 22.02 1.38 0.00 0.00 1266176.75 579.74 2258471.38 00:10:58.196 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x20 00:10:58.196 Malloc2p1 : 6.28 22.91 1.43 0.00 0.00 1217879.99 554.54 2013265.92 00:10:58.196 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x20 length 0x20 00:10:58.196 Malloc2p1 : 6.54 22.02 1.38 0.00 0.00 1252833.05 576.59 2232660.28 00:10:58.196 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x20 00:10:58.196 Malloc2p2 : 6.29 22.91 1.43 0.00 0.00 1207135.37 478.92 1987454.82 00:10:58.196 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x20 length 0x20 00:10:58.196 Malloc2p2 : 6.54 22.01 1.38 0.00 0.00 1239480.42 573.44 2193943.63 00:10:58.196 Job: Malloc2p3 (Core Mask 0x1, workload: 
verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x20 00:10:58.196 Malloc2p3 : 6.29 22.91 1.43 0.00 0.00 1195996.87 491.52 1961643.72 00:10:58.196 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x20 length 0x20 00:10:58.196 Malloc2p3 : 6.54 22.01 1.38 0.00 0.00 1226756.23 589.19 2168132.53 00:10:58.196 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x20 00:10:58.196 Malloc2p4 : 6.29 22.90 1.43 0.00 0.00 1184193.87 485.22 1935832.62 00:10:58.196 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x20 length 0x20 00:10:58.196 Malloc2p4 : 6.54 22.00 1.38 0.00 0.00 1213534.73 576.59 2142321.43 00:10:58.196 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x20 00:10:58.196 Malloc2p5 : 6.29 22.90 1.43 0.00 0.00 1172606.22 485.22 1910021.51 00:10:58.196 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x20 length 0x20 00:10:58.196 Malloc2p5 : 6.55 22.00 1.37 0.00 0.00 1200050.82 573.44 2103604.78 00:10:58.196 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:58.196 Verification LBA range: start 0x0 length 0x20 00:10:58.197 Malloc2p6 : 6.39 25.04 1.57 0.00 0.00 1072595.74 485.22 1884210.41 00:10:58.197 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x20 length 0x20 00:10:58.197 Malloc2p6 : 6.55 21.99 1.37 0.00 0.00 1186147.54 563.99 2077793.67 00:10:58.197 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x0 length 0x20 00:10:58.197 Malloc2p7 : 6.39 25.04 1.56 0.00 0.00 1062186.37 482.07 1858399.31 00:10:58.197 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x20 length 0x20 00:10:58.197 Malloc2p7 : 6.55 21.99 1.37 0.00 0.00 1172882.35 573.44 2039077.02 00:10:58.197 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x0 length 0x100 00:10:58.197 TestPT : 6.84 35.37 2.21 0.00 0.00 2847993.74 106470.79 3974909.64 00:10:58.197 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x100 length 0x100 00:10:58.197 TestPT : 7.10 31.56 1.97 0.00 0.00 3144251.64 105664.20 3949098.54 00:10:58.197 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x0 length 0x200 00:10:58.197 raid0 : 6.89 39.50 2.47 0.00 0.00 2469609.28 1272.91 4749242.68 00:10:58.197 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x200 length 0x200 00:10:58.197 raid0 : 7.04 39.50 2.47 0.00 0.00 2424099.29 1556.48 4749242.68 00:10:58.197 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x0 length 0x200 00:10:58.197 concat0 : 6.89 46.46 2.90 0.00 0.00 2058182.27 1279.21 4568564.97 00:10:58.197 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x200 length 0x200 00:10:58.197 concat0 : 7.06 49.85 3.12 
0.00 0.00 1893701.81 1556.48 4516942.77 00:10:58.197 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x0 length 0x100 00:10:58.197 raid1 : 6.89 60.39 3.77 0.00 0.00 1561043.57 1625.80 4387887.26 00:10:58.197 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x100 length 0x100 00:10:58.197 raid1 : 7.06 59.88 3.74 0.00 0.00 1506317.18 2003.89 4336265.06 00:10:58.197 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x0 length 0x4e 00:10:58.197 AIO0 : 7.00 68.58 4.29 0.00 0.00 819635.67 595.50 2877937.82 00:10:58.197 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:10:58.197 Verification LBA range: start 0x4e length 0x4e 00:10:58.197 AIO0 : 7.22 100.06 6.25 0.00 0.00 534457.17 343.43 3110237.74 00:10:58.197 =================================================================================================================== 00:10:58.197 Total : 1352.13 84.51 0.00 0.00 1511295.81 343.43 5936553.35 00:10:58.197 00:10:58.197 real 0m8.235s 00:10:58.197 user 0m15.590s 00:10:58.197 sys 0m0.329s 00:10:58.197 15:47:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:58.197 15:47:17 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:10:58.197 ************************************ 00:10:58.197 END TEST bdev_verify_big_io 00:10:58.197 ************************************ 00:10:58.197 15:47:17 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:58.197 15:47:17 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:58.197 15:47:17 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:58.197 15:47:17 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:58.197 15:47:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:58.197 ************************************ 00:10:58.197 START TEST bdev_write_zeroes 00:10:58.197 ************************************ 00:10:58.197 15:47:17 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:58.197 [2024-07-12 15:47:17.962976] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
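For anyone replaying this stage outside the Jenkins harness, the bdev_write_zeroes run that starts above reduces to a single bdevperf invocation against the bdev.json generated earlier in the suite. A minimal sketch, with paths shortened relative to the SPDK checkout and the JSON file assumed to already exist:

  # 1-second write_zeroes workload, queue depth 128, 4 KiB I/O,
  # driven against every bdev described in test/bdev/bdev.json
  ./build/examples/bdevperf --json test/bdev/bdev.json \
      -q 128 -o 4096 -w write_zeroes -t 1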
00:10:58.197 [2024-07-12 15:47:17.963018] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2488100 ] 00:10:58.197 [2024-07-12 15:47:18.050504] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:58.197 [2024-07-12 15:47:18.117240] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:58.197 [2024-07-12 15:47:18.234381] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:58.197 [2024-07-12 15:47:18.234422] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:58.197 [2024-07-12 15:47:18.234431] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:58.197 [2024-07-12 15:47:18.242386] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:58.197 [2024-07-12 15:47:18.242405] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:58.197 [2024-07-12 15:47:18.250399] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:58.197 [2024-07-12 15:47:18.250415] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:58.197 [2024-07-12 15:47:18.311120] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:58.197 [2024-07-12 15:47:18.311159] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:58.197 [2024-07-12 15:47:18.311169] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x211fb30 00:10:58.197 [2024-07-12 15:47:18.311176] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:58.197 [2024-07-12 15:47:18.312304] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:58.197 [2024-07-12 15:47:18.312324] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:58.197 Running I/O for 1 seconds... 
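The vbdev_passthru notices in the trace above show where the TestPT device in the result tables comes from: the JSON config layers a passthru bdev on top of Malloc3, and registration is deferred until the base bdev arrives. A rough RPC equivalent for a live SPDK app is sketched below; the flag spelling follows the stock rpc.py helper and should be treated as an assumption, since the log itself only shows the in-app notices:

  # claim Malloc3 and re-expose it unchanged under the name TestPT
  scripts/rpc.py bdev_passthru_create -b Malloc3 -p TestPT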
00:10:59.139 00:10:59.139 Latency(us) 00:10:59.139 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:59.139 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc0 : 1.04 6014.84 23.50 0.00 0.00 21264.05 513.58 35691.91 00:10:59.139 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc1p0 : 1.04 6007.51 23.47 0.00 0.00 21264.53 771.94 35086.97 00:10:59.139 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc1p1 : 1.05 6000.24 23.44 0.00 0.00 21250.68 746.73 34280.37 00:10:59.139 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc2p0 : 1.05 5992.92 23.41 0.00 0.00 21236.19 762.49 33473.77 00:10:59.139 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc2p1 : 1.05 5985.69 23.38 0.00 0.00 21226.72 756.18 32868.82 00:10:59.139 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc2p2 : 1.05 5978.48 23.35 0.00 0.00 21215.53 768.79 32062.23 00:10:59.139 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc2p3 : 1.05 5971.25 23.33 0.00 0.00 21197.59 765.64 31255.63 00:10:59.139 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc2p4 : 1.05 5964.04 23.30 0.00 0.00 21187.83 765.64 30650.68 00:10:59.139 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc2p5 : 1.05 5956.88 23.27 0.00 0.00 21175.50 762.49 30045.74 00:10:59.139 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc2p6 : 1.05 5949.69 23.24 0.00 0.00 21160.74 762.49 29239.14 00:10:59.139 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 Malloc2p7 : 1.06 5942.55 23.21 0.00 0.00 21146.71 762.49 28432.54 00:10:59.139 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 TestPT : 1.06 5935.41 23.19 0.00 0.00 21136.52 784.54 27827.59 00:10:59.139 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 raid0 : 1.06 5927.24 23.15 0.00 0.00 21113.30 1424.15 26416.05 00:10:59.139 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 concat0 : 1.06 5919.22 23.12 0.00 0.00 21072.77 1424.15 24903.68 00:10:59.139 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 raid1 : 1.06 5909.18 23.08 0.00 0.00 21025.87 2218.14 22685.54 00:10:59.139 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:59.139 AIO0 : 1.06 5903.02 23.06 0.00 0.00 20961.87 793.99 21878.94 00:10:59.139 =================================================================================================================== 00:10:59.139 Total : 95358.16 372.49 0.00 0.00 21164.77 513.58 35691.91 00:10:59.399 00:10:59.399 real 0m1.886s 00:10:59.399 user 0m1.596s 00:10:59.399 sys 0m0.223s 00:10:59.399 15:47:19 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:59.399 15:47:19 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:10:59.399 ************************************ 00:10:59.399 END TEST bdev_write_zeroes 00:10:59.399 ************************************ 00:10:59.399 15:47:19 blockdev_general 
-- common/autotest_common.sh@1142 -- # return 0 00:10:59.399 15:47:19 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:59.399 15:47:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:59.399 15:47:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:59.399 15:47:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:59.660 ************************************ 00:10:59.660 START TEST bdev_json_nonenclosed 00:10:59.660 ************************************ 00:10:59.660 15:47:19 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:59.660 [2024-07-12 15:47:19.923996] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:10:59.660 [2024-07-12 15:47:19.924041] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2488425 ] 00:10:59.660 [2024-07-12 15:47:20.008800] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:59.660 [2024-07-12 15:47:20.074158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:59.660 [2024-07-12 15:47:20.074214] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
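The bdev_json_nonenclosed stage above is a deliberate negative test: bdevperf is pointed at test/bdev/nonenclosed.json, whose top level is not wrapped in a JSON object, and the expected outcome is exactly the json_config error above followed by a non-zero exit (recorded as es=234 just below). The file's actual contents are not shown in this log; an illustrative reproduction would look like:

  # illustrative only -- a config whose top level is not enclosed in {}
  printf '"subsystems": []\n' > /tmp/nonenclosed.json
  ./build/examples/bdevperf --json /tmp/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1
  echo $?   # expected to be non-zero; the harness records es=234 and carries on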
00:10:59.660 [2024-07-12 15:47:20.074225] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:59.660 [2024-07-12 15:47:20.074232] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:59.922 00:10:59.922 real 0m0.265s 00:10:59.922 user 0m0.163s 00:10:59.922 sys 0m0.099s 00:10:59.922 15:47:20 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:10:59.922 15:47:20 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:59.922 15:47:20 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:10:59.922 ************************************ 00:10:59.922 END TEST bdev_json_nonenclosed 00:10:59.922 ************************************ 00:10:59.922 15:47:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:10:59.922 15:47:20 blockdev_general -- bdev/blockdev.sh@782 -- # true 00:10:59.922 15:47:20 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:59.922 15:47:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:10:59.922 15:47:20 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:59.922 15:47:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:59.922 ************************************ 00:10:59.922 START TEST bdev_json_nonarray 00:10:59.922 ************************************ 00:10:59.922 15:47:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:59.922 [2024-07-12 15:47:20.265248] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:10:59.922 [2024-07-12 15:47:20.265291] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2488457 ] 00:10:59.922 [2024-07-12 15:47:20.350636] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:00.183 [2024-07-12 15:47:20.413848] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:00.183 [2024-07-12 15:47:20.413908] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
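bdev_json_nonarray is the companion negative test: here "subsystems" exists but is not an array, so json_config rejects it with the error above and bdevperf again exits non-zero. The bare 'true' step that follows each return 234 in the trace suggests the harness chains these expected failures with '|| true' so the 234 exit does not abort the suite under 'set -e'; a condensed sketch of that pattern, paths shortened and the '|| true' inferred from the trace:

  run_test "bdev_json_nonarray" ./build/examples/bdevperf \
      --json test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' || true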
00:11:00.183 [2024-07-12 15:47:20.413919] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:11:00.183 [2024-07-12 15:47:20.413926] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:11:00.183 00:11:00.183 real 0m0.257s 00:11:00.183 user 0m0.158s 00:11:00.183 sys 0m0.098s 00:11:00.183 15:47:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:11:00.183 15:47:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:00.183 15:47:20 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:11:00.183 ************************************ 00:11:00.183 END TEST bdev_json_nonarray 00:11:00.183 ************************************ 00:11:00.183 15:47:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 234 00:11:00.183 15:47:20 blockdev_general -- bdev/blockdev.sh@785 -- # true 00:11:00.183 15:47:20 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:11:00.183 15:47:20 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:11:00.183 15:47:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:00.183 15:47:20 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:00.183 15:47:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:00.183 ************************************ 00:11:00.183 START TEST bdev_qos 00:11:00.183 ************************************ 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite '' 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=2488480 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 2488480' 00:11:00.183 Process qos testing pid: 2488480 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 2488480 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 2488480 ']' 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:00.183 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:00.183 15:47:20 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:00.444 [2024-07-12 15:47:20.646749] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
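The bdev_qos suite that starts here drives bdevperf differently from the earlier stages: the app is launched with -z (start idle and wait for RPCs) and -m 0x2, the test bdevs are created over the RPC socket (/var/tmp/spdk.sock), and the queued 60-second randread job is kicked off with bdevperf.py perform_tests. Condensed from the rpc_cmd calls in the trace below; in the real test the 14000 IOPS limit is only chosen after a first unthrottled measurement:

  # create the two targets while bdevperf idles in -z mode
  scripts/rpc.py bdev_malloc_create -b Malloc_0 128 512   # 128 MiB, 512-byte blocks
  scripts/rpc.py bdev_null_create Null_1 128 512
  # throttle Malloc_0, then start the queued randread workload
  scripts/rpc.py bdev_set_qos_limit --rw_ios_per_sec 14000 Malloc_0
  examples/bdev/bdevperf/bdevperf.py perform_tests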
00:11:00.444 [2024-07-12 15:47:20.646883] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2488480 ] 00:11:00.444 [2024-07-12 15:47:20.780763] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:00.444 [2024-07-12 15:47:20.880037] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:00.705 Malloc_0 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:00.705 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:00.966 [ 00:11:00.966 { 00:11:00.966 "name": "Malloc_0", 00:11:00.966 "aliases": [ 00:11:00.966 "b3426882-d889-462b-bb9d-8da76158c2f9" 00:11:00.966 ], 00:11:00.966 "product_name": "Malloc disk", 00:11:00.966 "block_size": 512, 00:11:00.966 "num_blocks": 262144, 00:11:00.966 "uuid": "b3426882-d889-462b-bb9d-8da76158c2f9", 00:11:00.966 "assigned_rate_limits": { 00:11:00.966 "rw_ios_per_sec": 0, 00:11:00.966 "rw_mbytes_per_sec": 0, 00:11:00.966 "r_mbytes_per_sec": 0, 00:11:00.966 "w_mbytes_per_sec": 0 00:11:00.966 }, 00:11:00.966 "claimed": false, 00:11:00.966 "zoned": false, 00:11:00.966 "supported_io_types": { 00:11:00.966 "read": true, 00:11:00.966 "write": true, 00:11:00.966 "unmap": true, 00:11:00.966 "flush": true, 00:11:00.966 "reset": true, 00:11:00.966 "nvme_admin": false, 00:11:00.966 "nvme_io": false, 00:11:00.966 "nvme_io_md": false, 00:11:00.966 "write_zeroes": true, 00:11:00.966 "zcopy": true, 00:11:00.966 "get_zone_info": false, 00:11:00.966 "zone_management": false, 00:11:00.966 "zone_append": false, 00:11:00.966 "compare": false, 
00:11:00.966 "compare_and_write": false, 00:11:00.966 "abort": true, 00:11:00.966 "seek_hole": false, 00:11:00.966 "seek_data": false, 00:11:00.966 "copy": true, 00:11:00.966 "nvme_iov_md": false 00:11:00.966 }, 00:11:00.966 "memory_domains": [ 00:11:00.966 { 00:11:00.966 "dma_device_id": "system", 00:11:00.966 "dma_device_type": 1 00:11:00.966 }, 00:11:00.966 { 00:11:00.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:00.966 "dma_device_type": 2 00:11:00.966 } 00:11:00.966 ], 00:11:00.966 "driver_specific": {} 00:11:00.966 } 00:11:00.966 ] 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:00.966 Null_1 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:00.966 [ 00:11:00.966 { 00:11:00.966 "name": "Null_1", 00:11:00.966 "aliases": [ 00:11:00.966 "28c29079-16c2-4fa1-8c0e-e1fcd075362d" 00:11:00.966 ], 00:11:00.966 "product_name": "Null disk", 00:11:00.966 "block_size": 512, 00:11:00.966 "num_blocks": 262144, 00:11:00.966 "uuid": "28c29079-16c2-4fa1-8c0e-e1fcd075362d", 00:11:00.966 "assigned_rate_limits": { 00:11:00.966 "rw_ios_per_sec": 0, 00:11:00.966 "rw_mbytes_per_sec": 0, 00:11:00.966 "r_mbytes_per_sec": 0, 00:11:00.966 "w_mbytes_per_sec": 0 00:11:00.966 }, 00:11:00.966 "claimed": false, 00:11:00.966 "zoned": false, 00:11:00.966 "supported_io_types": { 00:11:00.966 "read": true, 00:11:00.966 "write": true, 00:11:00.966 "unmap": false, 00:11:00.966 "flush": false, 00:11:00.966 "reset": true, 00:11:00.966 "nvme_admin": false, 00:11:00.966 "nvme_io": false, 00:11:00.966 "nvme_io_md": false, 00:11:00.966 "write_zeroes": true, 00:11:00.966 "zcopy": false, 00:11:00.966 "get_zone_info": false, 00:11:00.966 "zone_management": false, 00:11:00.966 "zone_append": false, 00:11:00.966 
"compare": false, 00:11:00.966 "compare_and_write": false, 00:11:00.966 "abort": true, 00:11:00.966 "seek_hole": false, 00:11:00.966 "seek_data": false, 00:11:00.966 "copy": false, 00:11:00.966 "nvme_iov_md": false 00:11:00.966 }, 00:11:00.966 "driver_specific": {} 00:11:00.966 } 00:11:00.966 ] 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:11:00.966 15:47:21 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:11:00.966 Running I/O for 60 seconds... 
00:11:06.297 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 58329.18 233316.72 0.00 0.00 234496.00 0.00 0.00 ' 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=58329.18 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 58329 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=58329 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=14000 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 14000 -gt 1000 ']' 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 14000 Malloc_0 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 14000 IOPS Malloc_0 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:06.298 15:47:26 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:06.298 ************************************ 00:11:06.298 START TEST bdev_qos_iops 00:11:06.298 ************************************ 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 14000 IOPS Malloc_0 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=14000 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:11:06.298 15:47:26 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 13999.40 55997.61 0.00 0.00 57176.00 0.00 0.00 ' 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=13999.40 00:11:11.584 15:47:31 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 13999 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=13999 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=12600 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=15400 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 13999 -lt 12600 ']' 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 13999 -gt 15400 ']' 00:11:11.584 00:11:11.584 real 0m5.240s 00:11:11.584 user 0m0.107s 00:11:11.584 sys 0m0.046s 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:11.584 15:47:31 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:11:11.584 ************************************ 00:11:11.584 END TEST bdev_qos_iops 00:11:11.584 ************************************ 00:11:11.584 15:47:31 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:11:11.584 15:47:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:11:11.584 15:47:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:11.584 15:47:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:11:11.584 15:47:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:11.584 15:47:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:11.584 15:47:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:11:11.584 15:47:31 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 20412.65 81650.59 0.00 0.00 82944.00 0.00 0.00 ' 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=82944.00 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 82944 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=82944 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=8 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 8 -lt 2 ']' 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 8 Null_1 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 8 BANDWIDTH Null_1 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:16.870 15:47:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:16.870 ************************************ 00:11:16.870 START TEST bdev_qos_bw 00:11:16.870 ************************************ 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 8 BANDWIDTH Null_1 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=8 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:11:16.870 15:47:37 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 2047.87 8191.46 0.00 0.00 8340.00 0.00 0.00 ' 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=8340.00 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 8340 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=8340 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=8192 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=7372 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=9011 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8340 -lt 7372 ']' 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 8340 -gt 9011 ']' 00:11:22.156 00:11:22.156 real 0m5.263s 00:11:22.156 user 0m0.110s 00:11:22.156 sys 0m0.045s 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:11:22.156 ************************************ 00:11:22.156 END TEST bdev_qos_bw 00:11:22.156 ************************************ 00:11:22.156 15:47:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # 
return 0 00:11:22.156 15:47:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:11:22.156 15:47:42 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:22.156 15:47:42 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:22.156 15:47:42 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:22.156 15:47:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:11:22.156 15:47:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:22.156 15:47:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:22.156 15:47:42 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:22.156 ************************************ 00:11:22.156 START TEST bdev_qos_ro_bw 00:11:22.156 ************************************ 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:11:22.156 15:47:42 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.60 2046.41 0.00 0.00 2060.00 0.00 0.00 ' 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2060.00 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2060 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2060 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw 
-- bdev/blockdev.sh@400 -- # '[' 2060 -lt 1843 ']' 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2060 -gt 2252 ']' 00:11:27.443 00:11:27.443 real 0m5.176s 00:11:27.443 user 0m0.108s 00:11:27.443 sys 0m0.041s 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:27.443 15:47:47 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:11:27.443 ************************************ 00:11:27.443 END TEST bdev_qos_ro_bw 00:11:27.443 ************************************ 00:11:27.443 15:47:47 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:11:27.443 15:47:47 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:11:27.443 15:47:47 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:27.443 15:47:47 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:28.014 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:28.014 15:47:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:11:28.014 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:28.014 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:28.275 00:11:28.275 Latency(us) 00:11:28.275 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:28.275 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:28.275 Malloc_0 : 26.93 19676.88 76.86 0.00 0.00 12887.07 2255.95 503316.48 00:11:28.275 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:28.275 Null_1 : 27.08 20047.12 78.31 0.00 0.00 12738.50 920.02 153253.42 00:11:28.275 =================================================================================================================== 00:11:28.275 Total : 39724.00 155.17 0.00 0.00 12811.89 920.02 503316.48 00:11:28.275 0 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 2488480 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 2488480 ']' 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 2488480 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2488480 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2488480' 00:11:28.275 killing process with pid 2488480 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 2488480 00:11:28.275 Received shutdown signal, test time was about 27.146473 seconds 00:11:28.275 00:11:28.275 Latency(us) 00:11:28.275 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:28.275 
=================================================================================================================== 00:11:28.275 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 2488480 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:11:28.275 00:11:28.275 real 0m28.148s 00:11:28.275 user 0m29.112s 00:11:28.275 sys 0m0.798s 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:28.275 15:47:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:28.275 ************************************ 00:11:28.275 END TEST bdev_qos 00:11:28.275 ************************************ 00:11:28.536 15:47:48 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:28.536 15:47:48 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:11:28.536 15:47:48 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:28.536 15:47:48 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:28.536 15:47:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:28.536 ************************************ 00:11:28.536 START TEST bdev_qd_sampling 00:11:28.536 ************************************ 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=2493188 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 2493188' 00:11:28.536 Process bdev QD sampling period testing pid: 2493188 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 2493188 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 2493188 ']' 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:28.536 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:28.536 15:47:48 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:28.536 [2024-07-12 15:47:48.828985] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
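The bdev_qd_sampling stage starting above exercises the queue-depth sampling statistics rather than throughput: a Malloc_QD bdev is created, sampling is enabled with a period of 10, and bdev_get_iostat is read back and filtered with jq to confirm the reported queue_depth_polling_period. Condensed from the rpc_cmd calls in the trace below:

  scripts/rpc.py bdev_malloc_create -b Malloc_QD 128 512
  scripts/rpc.py bdev_set_qd_sampling_period Malloc_QD 10
  scripts/rpc.py bdev_get_iostat -b Malloc_QD \
      | jq -r '.bdevs[0].queue_depth_polling_period'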
00:11:28.536 [2024-07-12 15:47:48.829037] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2493188 ] 00:11:28.536 [2024-07-12 15:47:48.919775] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:28.797 [2024-07-12 15:47:49.015040] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:28.797 [2024-07-12 15:47:49.015047] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:29.367 Malloc_QD 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:29.367 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:29.367 [ 00:11:29.367 { 00:11:29.367 "name": "Malloc_QD", 00:11:29.367 "aliases": [ 00:11:29.367 "49ff981b-d5b3-46e0-8bcd-2fc1859efffb" 00:11:29.367 ], 00:11:29.367 "product_name": "Malloc disk", 00:11:29.367 "block_size": 512, 00:11:29.367 "num_blocks": 262144, 00:11:29.367 "uuid": "49ff981b-d5b3-46e0-8bcd-2fc1859efffb", 00:11:29.367 "assigned_rate_limits": { 00:11:29.367 "rw_ios_per_sec": 0, 00:11:29.367 "rw_mbytes_per_sec": 0, 00:11:29.367 "r_mbytes_per_sec": 0, 00:11:29.367 "w_mbytes_per_sec": 0 00:11:29.367 }, 00:11:29.367 "claimed": false, 00:11:29.367 "zoned": false, 00:11:29.367 "supported_io_types": { 00:11:29.367 "read": true, 00:11:29.367 "write": true, 00:11:29.367 "unmap": true, 00:11:29.367 "flush": true, 00:11:29.367 "reset": true, 00:11:29.367 "nvme_admin": false, 00:11:29.367 
"nvme_io": false, 00:11:29.367 "nvme_io_md": false, 00:11:29.367 "write_zeroes": true, 00:11:29.367 "zcopy": true, 00:11:29.367 "get_zone_info": false, 00:11:29.367 "zone_management": false, 00:11:29.367 "zone_append": false, 00:11:29.367 "compare": false, 00:11:29.367 "compare_and_write": false, 00:11:29.367 "abort": true, 00:11:29.367 "seek_hole": false, 00:11:29.367 "seek_data": false, 00:11:29.367 "copy": true, 00:11:29.367 "nvme_iov_md": false 00:11:29.367 }, 00:11:29.367 "memory_domains": [ 00:11:29.367 { 00:11:29.367 "dma_device_id": "system", 00:11:29.367 "dma_device_type": 1 00:11:29.367 }, 00:11:29.367 { 00:11:29.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:29.367 "dma_device_type": 2 00:11:29.368 } 00:11:29.368 ], 00:11:29.368 "driver_specific": {} 00:11:29.368 } 00:11:29.368 ] 00:11:29.368 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:29.368 15:47:49 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:11:29.368 15:47:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:11:29.368 15:47:49 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:29.627 Running I/O for 5 seconds... 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:11:31.541 "tick_rate": 2600000000, 00:11:31.541 "ticks": 11940825140037223, 00:11:31.541 "bdevs": [ 00:11:31.541 { 00:11:31.541 "name": "Malloc_QD", 00:11:31.541 "bytes_read": 1105244672, 00:11:31.541 "num_read_ops": 269828, 00:11:31.541 "bytes_written": 0, 00:11:31.541 "num_write_ops": 0, 00:11:31.541 "bytes_unmapped": 0, 00:11:31.541 "num_unmap_ops": 0, 00:11:31.541 "bytes_copied": 0, 00:11:31.541 "num_copy_ops": 0, 00:11:31.541 "read_latency_ticks": 2575348887662, 00:11:31.541 "max_read_latency_ticks": 12947128, 00:11:31.541 "min_read_latency_ticks": 282806, 00:11:31.541 "write_latency_ticks": 0, 00:11:31.541 "max_write_latency_ticks": 0, 00:11:31.541 "min_write_latency_ticks": 0, 00:11:31.541 "unmap_latency_ticks": 0, 00:11:31.541 "max_unmap_latency_ticks": 0, 00:11:31.541 
"min_unmap_latency_ticks": 0, 00:11:31.541 "copy_latency_ticks": 0, 00:11:31.541 "max_copy_latency_ticks": 0, 00:11:31.541 "min_copy_latency_ticks": 0, 00:11:31.541 "io_error": {}, 00:11:31.541 "queue_depth_polling_period": 10, 00:11:31.541 "queue_depth": 512, 00:11:31.541 "io_time": 40, 00:11:31.541 "weighted_io_time": 20480 00:11:31.541 } 00:11:31.541 ] 00:11:31.541 }' 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:31.541 00:11:31.541 Latency(us) 00:11:31.541 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:31.541 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:31.541 Malloc_QD : 2.04 73350.42 286.53 0.00 0.00 3482.74 1064.96 4083.40 00:11:31.541 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:31.541 Malloc_QD : 2.04 65976.93 257.72 0.00 0.00 3871.40 1033.45 4990.82 00:11:31.541 =================================================================================================================== 00:11:31.541 Total : 139327.35 544.25 0.00 0.00 3666.93 1033.45 4990.82 00:11:31.541 0 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 2493188 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 2493188 ']' 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 2493188 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2493188 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2493188' 00:11:31.541 killing process with pid 2493188 00:11:31.541 15:47:51 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 2493188 00:11:31.541 Received shutdown signal, test time was about 2.125342 seconds 00:11:31.541 00:11:31.541 Latency(us) 00:11:31.541 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:31.541 =================================================================================================================== 00:11:31.541 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:31.541 15:47:51 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 2493188 00:11:31.801 15:47:52 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:11:31.801 00:11:31.801 real 0m3.356s 00:11:31.801 user 0m6.649s 00:11:31.801 sys 0m0.371s 00:11:31.802 15:47:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:31.802 15:47:52 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:31.802 ************************************ 00:11:31.802 END TEST bdev_qd_sampling 00:11:31.802 ************************************ 00:11:31.802 15:47:52 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:31.802 15:47:52 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:11:31.802 15:47:52 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:31.802 15:47:52 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:31.802 15:47:52 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:31.802 ************************************ 00:11:31.802 START TEST bdev_error 00:11:31.802 ************************************ 00:11:31.802 15:47:52 blockdev_general.bdev_error -- common/autotest_common.sh@1123 -- # error_test_suite '' 00:11:31.802 15:47:52 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:11:31.802 15:47:52 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:11:31.802 15:47:52 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:11:31.802 15:47:52 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=2493808 00:11:31.802 15:47:52 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 2493808' 00:11:31.802 Process error testing pid: 2493808 00:11:31.802 15:47:52 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 2493808 00:11:31.802 15:47:52 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2493808 ']' 00:11:31.802 15:47:52 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:31.802 15:47:52 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:31.802 15:47:52 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:31.802 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:31.802 15:47:52 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:31.802 15:47:52 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:31.802 15:47:52 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:11:32.061 [2024-07-12 15:47:52.270507] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
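bdevperf is launched here in RPC-wait mode (-z) and with -f, which the suite's later "continue on error is set" message ties to error tolerance, so the error test builds its bdev stack over the RPC socket before kicking off I/O. A condensed sketch of the sequence it drives next, using the same commands that appear in the trace below; the standalone rpc.py form stands in for the autotest rpc_cmd wrapper, which targets the same default /var/tmp/spdk.sock:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py"                                      # bdevperf above listens on /var/tmp/spdk.sock

$RPC bdev_malloc_create -b Dev_1 128 512                        # 262144 x 512 B blocks = 128 MiB
$RPC bdev_error_create Dev_1                                    # error-injection bdev EE_Dev_1 stacked on Dev_1
$RPC bdev_malloc_create -b Dev_2 128 512                        # plain second target, never fails
$RPC bdev_error_inject_error EE_Dev_1 all failure -n 5          # fail the next 5 I/Os of any type
"$SPDK/examples/bdev/bdevperf/bdevperf.py" -t 1 perform_tests   # start the 5 s randread job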
00:11:32.061 [2024-07-12 15:47:52.270584] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2493808 ] 00:11:32.061 [2024-07-12 15:47:52.354026] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:32.061 [2024-07-12 15:47:52.452436] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:33.005 15:47:53 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.005 Dev_1 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.005 15:47:53 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.005 [ 00:11:33.005 { 00:11:33.005 "name": "Dev_1", 00:11:33.005 "aliases": [ 00:11:33.005 "d35e2f29-6ae0-4fa0-a000-2b6c96cf8b16" 00:11:33.005 ], 00:11:33.005 "product_name": "Malloc disk", 00:11:33.005 "block_size": 512, 00:11:33.005 "num_blocks": 262144, 00:11:33.005 "uuid": "d35e2f29-6ae0-4fa0-a000-2b6c96cf8b16", 00:11:33.005 "assigned_rate_limits": { 00:11:33.005 "rw_ios_per_sec": 0, 00:11:33.005 "rw_mbytes_per_sec": 0, 00:11:33.005 "r_mbytes_per_sec": 0, 00:11:33.005 "w_mbytes_per_sec": 0 00:11:33.005 }, 00:11:33.005 "claimed": false, 00:11:33.005 "zoned": false, 00:11:33.005 "supported_io_types": { 00:11:33.005 "read": true, 00:11:33.005 "write": true, 00:11:33.005 "unmap": true, 00:11:33.005 "flush": true, 00:11:33.005 "reset": true, 00:11:33.005 "nvme_admin": false, 00:11:33.005 "nvme_io": false, 00:11:33.005 "nvme_io_md": false, 00:11:33.005 "write_zeroes": true, 00:11:33.005 "zcopy": true, 00:11:33.005 "get_zone_info": false, 00:11:33.005 "zone_management": false, 00:11:33.005 "zone_append": false, 00:11:33.005 
"compare": false, 00:11:33.005 "compare_and_write": false, 00:11:33.005 "abort": true, 00:11:33.005 "seek_hole": false, 00:11:33.005 "seek_data": false, 00:11:33.005 "copy": true, 00:11:33.005 "nvme_iov_md": false 00:11:33.005 }, 00:11:33.005 "memory_domains": [ 00:11:33.005 { 00:11:33.005 "dma_device_id": "system", 00:11:33.005 "dma_device_type": 1 00:11:33.005 }, 00:11:33.005 { 00:11:33.005 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.005 "dma_device_type": 2 00:11:33.005 } 00:11:33.005 ], 00:11:33.005 "driver_specific": {} 00:11:33.005 } 00:11:33.005 ] 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:33.005 15:47:53 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.005 true 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.005 15:47:53 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.005 Dev_2 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.005 15:47:53 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.005 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.005 [ 00:11:33.005 { 00:11:33.005 "name": "Dev_2", 00:11:33.005 "aliases": [ 00:11:33.005 "4128d670-26a1-4d41-b097-d45da041265a" 00:11:33.005 ], 00:11:33.005 "product_name": "Malloc disk", 00:11:33.005 "block_size": 512, 00:11:33.005 "num_blocks": 262144, 00:11:33.005 "uuid": "4128d670-26a1-4d41-b097-d45da041265a", 00:11:33.005 "assigned_rate_limits": { 00:11:33.005 "rw_ios_per_sec": 0, 00:11:33.005 "rw_mbytes_per_sec": 0, 00:11:33.005 "r_mbytes_per_sec": 0, 00:11:33.005 "w_mbytes_per_sec": 0 00:11:33.006 }, 00:11:33.006 "claimed": false, 
00:11:33.006 "zoned": false, 00:11:33.006 "supported_io_types": { 00:11:33.006 "read": true, 00:11:33.006 "write": true, 00:11:33.006 "unmap": true, 00:11:33.006 "flush": true, 00:11:33.006 "reset": true, 00:11:33.006 "nvme_admin": false, 00:11:33.006 "nvme_io": false, 00:11:33.006 "nvme_io_md": false, 00:11:33.006 "write_zeroes": true, 00:11:33.006 "zcopy": true, 00:11:33.006 "get_zone_info": false, 00:11:33.006 "zone_management": false, 00:11:33.006 "zone_append": false, 00:11:33.006 "compare": false, 00:11:33.006 "compare_and_write": false, 00:11:33.006 "abort": true, 00:11:33.006 "seek_hole": false, 00:11:33.006 "seek_data": false, 00:11:33.006 "copy": true, 00:11:33.006 "nvme_iov_md": false 00:11:33.006 }, 00:11:33.006 "memory_domains": [ 00:11:33.006 { 00:11:33.006 "dma_device_id": "system", 00:11:33.006 "dma_device_type": 1 00:11:33.006 }, 00:11:33.006 { 00:11:33.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.006 "dma_device_type": 2 00:11:33.006 } 00:11:33.006 ], 00:11:33.006 "driver_specific": {} 00:11:33.006 } 00:11:33.006 ] 00:11:33.006 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.006 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:33.006 15:47:53 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:33.006 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.006 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.006 15:47:53 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.006 15:47:53 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:11:33.006 15:47:53 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:33.006 Running I/O for 5 seconds... 00:11:33.948 15:47:54 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 2493808 00:11:33.948 15:47:54 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 2493808' 00:11:33.948 Process is existed as continue on error is set. 
Pid: 2493808 00:11:33.948 15:47:54 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:11:33.948 15:47:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.948 15:47:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.948 15:47:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.948 15:47:54 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:11:33.948 15:47:54 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:33.948 15:47:54 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:33.948 15:47:54 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:33.948 15:47:54 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:11:33.948 Timeout while waiting for response: 00:11:33.948 00:11:33.948 00:11:38.153 00:11:38.153 Latency(us) 00:11:38.153 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:38.153 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:38.153 EE_Dev_1 : 0.91 35442.82 138.45 5.52 0.00 447.67 153.60 743.58 00:11:38.153 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:38.153 Dev_2 : 5.00 77357.44 302.18 0.00 0.00 203.22 69.32 18450.90 00:11:38.153 =================================================================================================================== 00:11:38.153 Total : 112800.26 440.63 5.52 0.00 221.97 69.32 18450.90 00:11:39.154 15:47:59 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 2493808 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 2493808 ']' 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 2493808 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2493808 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2493808' 00:11:39.154 killing process with pid 2493808 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 2493808 00:11:39.154 Received shutdown signal, test time was about 5.000000 seconds 00:11:39.154 00:11:39.154 Latency(us) 00:11:39.154 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:39.154 =================================================================================================================== 00:11:39.154 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 2493808 00:11:39.154 15:47:59 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=2495037 00:11:39.154 15:47:59 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 2495037' 00:11:39.154 Process error testing pid: 2495037 00:11:39.154 15:47:59 
blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:11:39.154 15:47:59 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 2495037 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 2495037 ']' 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:39.154 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:39.154 15:47:59 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:39.414 [2024-07-12 15:47:59.629589] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:11:39.414 [2024-07-12 15:47:59.629655] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2495037 ] 00:11:39.414 [2024-07-12 15:47:59.711614] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:39.414 [2024-07-12 15:47:59.809838] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:40.383 15:48:00 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:40.383 Dev_1 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:40.383 15:48:00 
blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:40.383 [ 00:11:40.383 { 00:11:40.383 "name": "Dev_1", 00:11:40.383 "aliases": [ 00:11:40.383 "fa14c0aa-8f67-49a9-b331-0c88d7f5921b" 00:11:40.383 ], 00:11:40.383 "product_name": "Malloc disk", 00:11:40.383 "block_size": 512, 00:11:40.383 "num_blocks": 262144, 00:11:40.383 "uuid": "fa14c0aa-8f67-49a9-b331-0c88d7f5921b", 00:11:40.383 "assigned_rate_limits": { 00:11:40.383 "rw_ios_per_sec": 0, 00:11:40.383 "rw_mbytes_per_sec": 0, 00:11:40.383 "r_mbytes_per_sec": 0, 00:11:40.383 "w_mbytes_per_sec": 0 00:11:40.383 }, 00:11:40.383 "claimed": false, 00:11:40.383 "zoned": false, 00:11:40.383 "supported_io_types": { 00:11:40.383 "read": true, 00:11:40.383 "write": true, 00:11:40.383 "unmap": true, 00:11:40.383 "flush": true, 00:11:40.383 "reset": true, 00:11:40.383 "nvme_admin": false, 00:11:40.383 "nvme_io": false, 00:11:40.383 "nvme_io_md": false, 00:11:40.383 "write_zeroes": true, 00:11:40.383 "zcopy": true, 00:11:40.383 "get_zone_info": false, 00:11:40.383 "zone_management": false, 00:11:40.383 "zone_append": false, 00:11:40.383 "compare": false, 00:11:40.383 "compare_and_write": false, 00:11:40.383 "abort": true, 00:11:40.383 "seek_hole": false, 00:11:40.383 "seek_data": false, 00:11:40.383 "copy": true, 00:11:40.383 "nvme_iov_md": false 00:11:40.383 }, 00:11:40.383 "memory_domains": [ 00:11:40.383 { 00:11:40.383 "dma_device_id": "system", 00:11:40.383 "dma_device_type": 1 00:11:40.383 }, 00:11:40.383 { 00:11:40.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.383 "dma_device_type": 2 00:11:40.383 } 00:11:40.383 ], 00:11:40.383 "driver_specific": {} 00:11:40.383 } 00:11:40.383 ] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:40.383 15:48:00 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:40.383 true 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:40.383 Dev_2 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@902 
-- # rpc_cmd bdev_wait_for_examine 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:40.383 [ 00:11:40.383 { 00:11:40.383 "name": "Dev_2", 00:11:40.383 "aliases": [ 00:11:40.383 "7d6d34e7-2a41-451c-bd8c-7c05280d27a9" 00:11:40.383 ], 00:11:40.383 "product_name": "Malloc disk", 00:11:40.383 "block_size": 512, 00:11:40.383 "num_blocks": 262144, 00:11:40.383 "uuid": "7d6d34e7-2a41-451c-bd8c-7c05280d27a9", 00:11:40.383 "assigned_rate_limits": { 00:11:40.383 "rw_ios_per_sec": 0, 00:11:40.383 "rw_mbytes_per_sec": 0, 00:11:40.383 "r_mbytes_per_sec": 0, 00:11:40.383 "w_mbytes_per_sec": 0 00:11:40.383 }, 00:11:40.383 "claimed": false, 00:11:40.383 "zoned": false, 00:11:40.383 "supported_io_types": { 00:11:40.383 "read": true, 00:11:40.383 "write": true, 00:11:40.383 "unmap": true, 00:11:40.383 "flush": true, 00:11:40.383 "reset": true, 00:11:40.383 "nvme_admin": false, 00:11:40.383 "nvme_io": false, 00:11:40.383 "nvme_io_md": false, 00:11:40.383 "write_zeroes": true, 00:11:40.383 "zcopy": true, 00:11:40.383 "get_zone_info": false, 00:11:40.383 "zone_management": false, 00:11:40.383 "zone_append": false, 00:11:40.383 "compare": false, 00:11:40.383 "compare_and_write": false, 00:11:40.383 "abort": true, 00:11:40.383 "seek_hole": false, 00:11:40.383 "seek_data": false, 00:11:40.383 "copy": true, 00:11:40.383 "nvme_iov_md": false 00:11:40.383 }, 00:11:40.383 "memory_domains": [ 00:11:40.383 { 00:11:40.383 "dma_device_id": "system", 00:11:40.383 "dma_device_type": 1 00:11:40.383 }, 00:11:40.383 { 00:11:40.383 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:40.383 "dma_device_type": 2 00:11:40.383 } 00:11:40.383 ], 00:11:40.383 "driver_specific": {} 00:11:40.383 } 00:11:40.383 ] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:40.383 15:48:00 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:40.383 15:48:00 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 2495037 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 2495037 00:11:40.383 15:48:00 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@636 -- # local arg=wait 00:11:40.383 15:48:00 blockdev_general.bdev_error -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:40.383 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 2495037 00:11:40.383 Running I/O for 5 seconds... 00:11:40.383 task offset: 105784 on job bdev=EE_Dev_1 fails 00:11:40.384 00:11:40.384 Latency(us) 00:11:40.384 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:40.384 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:40.384 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:11:40.384 EE_Dev_1 : 0.00 27363.18 106.89 6218.91 0.00 397.27 150.45 712.07 00:11:40.384 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:40.384 Dev_2 : 0.00 17075.77 66.70 0.00 0.00 688.71 150.45 1272.91 00:11:40.384 =================================================================================================================== 00:11:40.384 Total : 44438.96 173.59 6218.91 0.00 555.34 150.45 1272.91 00:11:40.384 [2024-07-12 15:48:00.750217] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:11:40.384 request: 00:11:40.384 { 00:11:40.384 "method": "perform_tests", 00:11:40.384 "req_id": 1 00:11:40.384 } 00:11:40.384 Got JSON-RPC error response 00:11:40.384 response: 00:11:40.384 { 00:11:40.384 "code": -32603, 00:11:40.384 "message": "bdevperf failed with error Operation not permitted" 00:11:40.384 } 00:11:40.645 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # es=255 00:11:40.645 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:40.645 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:11:40.645 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:11:40.645 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:11:40.645 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:40.645 00:11:40.645 real 0m8.750s 00:11:40.645 user 0m9.126s 00:11:40.645 sys 0m0.717s 00:11:40.645 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:40.645 15:48:00 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:40.645 ************************************ 00:11:40.645 END TEST bdev_error 00:11:40.645 ************************************ 00:11:40.645 15:48:00 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:40.645 15:48:00 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:11:40.645 15:48:00 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:40.645 15:48:00 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:40.645 15:48:00 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:40.645 ************************************ 00:11:40.645 START TEST bdev_stat 00:11:40.645 ************************************ 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=2495422 00:11:40.645 
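The second bdevperf instance above was launched without -f, so the five failures injected on EE_Dev_1 are fatal: perform_tests comes back with the -32603 response quoted in the trace, and the app itself must exit non-zero, which is what the suite asserts before moving on to the stat test. A plain-shell approximation of that expected-failure check (paths and ERR_PID as used in this run; the real script routes it through the NOT helper):

# this pass must NOT succeed: errors are injected and continue-on-error is off
"$SPDK/examples/bdev/bdevperf/bdevperf.py" -t 1 perform_tests && exit 1   # a clean run here would be a bug
! wait "$ERR_PID"    # bdevperf (pid 2495037), started earlier by this shell, must exit non-zero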
15:48:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 2495422' 00:11:40.645 Process Bdev IO statistics testing pid: 2495422 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 2495422 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 2495422 ']' 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:40.645 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:40.645 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:40.905 [2024-07-12 15:48:01.093365] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:11:40.905 [2024-07-12 15:48:01.093422] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2495422 ] 00:11:40.905 [2024-07-12 15:48:01.184877] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:40.906 [2024-07-12 15:48:01.280153] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:40.906 [2024-07-12 15:48:01.280159] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:41.847 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:41.847 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:11:41.847 15:48:01 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:11:41.847 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:41.847 15:48:01 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:41.847 Malloc_STAT 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
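The waitforbdev step running here for Malloc_STAT (and earlier for Dev_1 and Dev_2) amounts to flushing pending examine callbacks and then polling for the bdev with a timeout. A minimal stand-in for that helper, using the two RPCs and the 2000 ms timeout visible in the trace:

rpc_cmd() { "$SPDK/scripts/rpc.py" "$@"; }           # stand-in for the autotest rpc_cmd wrapper

waitforbdev() {
    local bdev_name=$1
    rpc_cmd bdev_wait_for_examine                    # let any pending examine callbacks finish
    rpc_cmd bdev_get_bdevs -b "$bdev_name" -t 2000   # -t waits up to 2000 ms for the bdev to appear
}

waitforbdev Malloc_STAT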
00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:41.847 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:41.847 [ 00:11:41.847 { 00:11:41.847 "name": "Malloc_STAT", 00:11:41.847 "aliases": [ 00:11:41.847 "d615595e-9d5a-46b8-837d-32f6549b2856" 00:11:41.847 ], 00:11:41.847 "product_name": "Malloc disk", 00:11:41.847 "block_size": 512, 00:11:41.847 "num_blocks": 262144, 00:11:41.847 "uuid": "d615595e-9d5a-46b8-837d-32f6549b2856", 00:11:41.847 "assigned_rate_limits": { 00:11:41.847 "rw_ios_per_sec": 0, 00:11:41.847 "rw_mbytes_per_sec": 0, 00:11:41.847 "r_mbytes_per_sec": 0, 00:11:41.847 "w_mbytes_per_sec": 0 00:11:41.847 }, 00:11:41.847 "claimed": false, 00:11:41.847 "zoned": false, 00:11:41.847 "supported_io_types": { 00:11:41.847 "read": true, 00:11:41.847 "write": true, 00:11:41.847 "unmap": true, 00:11:41.847 "flush": true, 00:11:41.847 "reset": true, 00:11:41.847 "nvme_admin": false, 00:11:41.847 "nvme_io": false, 00:11:41.847 "nvme_io_md": false, 00:11:41.847 "write_zeroes": true, 00:11:41.847 "zcopy": true, 00:11:41.848 "get_zone_info": false, 00:11:41.848 "zone_management": false, 00:11:41.848 "zone_append": false, 00:11:41.848 "compare": false, 00:11:41.848 "compare_and_write": false, 00:11:41.848 "abort": true, 00:11:41.848 "seek_hole": false, 00:11:41.848 "seek_data": false, 00:11:41.848 "copy": true, 00:11:41.848 "nvme_iov_md": false 00:11:41.848 }, 00:11:41.848 "memory_domains": [ 00:11:41.848 { 00:11:41.848 "dma_device_id": "system", 00:11:41.848 "dma_device_type": 1 00:11:41.848 }, 00:11:41.848 { 00:11:41.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:41.848 "dma_device_type": 2 00:11:41.848 } 00:11:41.848 ], 00:11:41.848 "driver_specific": {} 00:11:41.848 } 00:11:41.848 ] 00:11:41.848 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:41.848 15:48:02 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:11:41.848 15:48:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:11:41.848 15:48:02 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:41.848 Running I/O for 10 seconds... 
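While the 10-second run proceeds, stat_function_test samples Malloc_STAT twice and checks that the per-channel read counters bracket the whole-bdev totals. A condensed sketch of that bookkeeping with the same RPCs and jq filters used below (the values in this run: 269828 reads in the first snapshot, 147200 + 132352 per channel, 296196 in the second):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py"

io_count1=$($RPC bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')

# one per-channel snapshot (-c); two reactors are polling, so two channels show up
iostats_per_channel=$($RPC bdev_get_iostat -b Malloc_STAT -c)
ch1=$(echo "$iostats_per_channel" | jq -r '.channels[0].num_read_ops')
ch2=$(echo "$iostats_per_channel" | jq -r '.channels[1].num_read_ops')
io_count_per_channel_all=$((ch1 + ch2))                       # 147200 + 132352 = 279552 here

io_count2=$($RPC bdev_get_iostat -b Malloc_STAT | jq -r '.bdevs[0].num_read_ops')

# the per-channel sum must land between the two snapshots (269828 <= 279552 <= 296196)
if [ "$io_count_per_channel_all" -lt "$io_count1" ]; then exit 1; fi
if [ "$io_count_per_channel_all" -gt "$io_count2" ]; then exit 1; fi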
00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:11:43.765 "tick_rate": 2600000000, 00:11:43.765 "ticks": 11940856975699823, 00:11:43.765 "bdevs": [ 00:11:43.765 { 00:11:43.765 "name": "Malloc_STAT", 00:11:43.765 "bytes_read": 1105244672, 00:11:43.765 "num_read_ops": 269828, 00:11:43.765 "bytes_written": 0, 00:11:43.765 "num_write_ops": 0, 00:11:43.765 "bytes_unmapped": 0, 00:11:43.765 "num_unmap_ops": 0, 00:11:43.765 "bytes_copied": 0, 00:11:43.765 "num_copy_ops": 0, 00:11:43.765 "read_latency_ticks": 2541185584110, 00:11:43.765 "max_read_latency_ticks": 12981060, 00:11:43.765 "min_read_latency_ticks": 308254, 00:11:43.765 "write_latency_ticks": 0, 00:11:43.765 "max_write_latency_ticks": 0, 00:11:43.765 "min_write_latency_ticks": 0, 00:11:43.765 "unmap_latency_ticks": 0, 00:11:43.765 "max_unmap_latency_ticks": 0, 00:11:43.765 "min_unmap_latency_ticks": 0, 00:11:43.765 "copy_latency_ticks": 0, 00:11:43.765 "max_copy_latency_ticks": 0, 00:11:43.765 "min_copy_latency_ticks": 0, 00:11:43.765 "io_error": {} 00:11:43.765 } 00:11:43.765 ] 00:11:43.765 }' 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=269828 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:11:43.765 "tick_rate": 2600000000, 00:11:43.765 "ticks": 11940857155435395, 00:11:43.765 "name": "Malloc_STAT", 00:11:43.765 "channels": [ 00:11:43.765 { 00:11:43.765 "thread_id": 2, 00:11:43.765 "bytes_read": 602931200, 00:11:43.765 "num_read_ops": 147200, 00:11:43.765 "bytes_written": 0, 00:11:43.765 "num_write_ops": 0, 00:11:43.765 "bytes_unmapped": 0, 00:11:43.765 "num_unmap_ops": 
0, 00:11:43.765 "bytes_copied": 0, 00:11:43.765 "num_copy_ops": 0, 00:11:43.765 "read_latency_ticks": 1316058450020, 00:11:43.765 "max_read_latency_ticks": 9868590, 00:11:43.765 "min_read_latency_ticks": 7325834, 00:11:43.765 "write_latency_ticks": 0, 00:11:43.765 "max_write_latency_ticks": 0, 00:11:43.765 "min_write_latency_ticks": 0, 00:11:43.765 "unmap_latency_ticks": 0, 00:11:43.765 "max_unmap_latency_ticks": 0, 00:11:43.765 "min_unmap_latency_ticks": 0, 00:11:43.765 "copy_latency_ticks": 0, 00:11:43.765 "max_copy_latency_ticks": 0, 00:11:43.765 "min_copy_latency_ticks": 0 00:11:43.765 }, 00:11:43.765 { 00:11:43.765 "thread_id": 3, 00:11:43.765 "bytes_read": 542113792, 00:11:43.765 "num_read_ops": 132352, 00:11:43.765 "bytes_written": 0, 00:11:43.765 "num_write_ops": 0, 00:11:43.765 "bytes_unmapped": 0, 00:11:43.765 "num_unmap_ops": 0, 00:11:43.765 "bytes_copied": 0, 00:11:43.765 "num_copy_ops": 0, 00:11:43.765 "read_latency_ticks": 1317440189578, 00:11:43.765 "max_read_latency_ticks": 12981060, 00:11:43.765 "min_read_latency_ticks": 8177620, 00:11:43.765 "write_latency_ticks": 0, 00:11:43.765 "max_write_latency_ticks": 0, 00:11:43.765 "min_write_latency_ticks": 0, 00:11:43.765 "unmap_latency_ticks": 0, 00:11:43.765 "max_unmap_latency_ticks": 0, 00:11:43.765 "min_unmap_latency_ticks": 0, 00:11:43.765 "copy_latency_ticks": 0, 00:11:43.765 "max_copy_latency_ticks": 0, 00:11:43.765 "min_copy_latency_ticks": 0 00:11:43.765 } 00:11:43.765 ] 00:11:43.765 }' 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=147200 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=147200 00:11:43.765 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:11:44.026 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=132352 00:11:44.026 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=279552 00:11:44.026 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:11:44.026 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:44.026 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:44.026 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:11:44.027 "tick_rate": 2600000000, 00:11:44.027 "ticks": 11940857460670611, 00:11:44.027 "bdevs": [ 00:11:44.027 { 00:11:44.027 "name": "Malloc_STAT", 00:11:44.027 "bytes_read": 1213248000, 00:11:44.027 "num_read_ops": 296196, 00:11:44.027 "bytes_written": 0, 00:11:44.027 "num_write_ops": 0, 00:11:44.027 "bytes_unmapped": 0, 00:11:44.027 "num_unmap_ops": 0, 00:11:44.027 "bytes_copied": 0, 00:11:44.027 "num_copy_ops": 0, 00:11:44.027 "read_latency_ticks": 2790495844960, 00:11:44.027 "max_read_latency_ticks": 12981060, 00:11:44.027 "min_read_latency_ticks": 308254, 00:11:44.027 "write_latency_ticks": 0, 00:11:44.027 "max_write_latency_ticks": 0, 00:11:44.027 "min_write_latency_ticks": 0, 00:11:44.027 "unmap_latency_ticks": 0, 00:11:44.027 "max_unmap_latency_ticks": 0, 00:11:44.027 "min_unmap_latency_ticks": 0, 00:11:44.027 "copy_latency_ticks": 0, 00:11:44.027 "max_copy_latency_ticks": 0, 00:11:44.027 
"min_copy_latency_ticks": 0, 00:11:44.027 "io_error": {} 00:11:44.027 } 00:11:44.027 ] 00:11:44.027 }' 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=296196 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 279552 -lt 269828 ']' 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 279552 -gt 296196 ']' 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:44.027 00:11:44.027 Latency(us) 00:11:44.027 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:44.027 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:44.027 Malloc_STAT : 2.17 74310.39 290.27 0.00 0.00 3437.72 1039.75 3806.13 00:11:44.027 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:44.027 Malloc_STAT : 2.17 66807.85 260.97 0.00 0.00 3823.39 1027.15 5016.02 00:11:44.027 =================================================================================================================== 00:11:44.027 Total : 141118.24 551.24 0.00 0.00 3620.39 1027.15 5016.02 00:11:44.027 0 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 2495422 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 2495422 ']' 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 2495422 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2495422 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2495422' 00:11:44.027 killing process with pid 2495422 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 2495422 00:11:44.027 Received shutdown signal, test time was about 2.249272 seconds 00:11:44.027 00:11:44.027 Latency(us) 00:11:44.027 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:44.027 =================================================================================================================== 00:11:44.027 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:44.027 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 2495422 00:11:44.288 15:48:04 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:11:44.288 00:11:44.288 real 0m3.521s 00:11:44.288 user 0m7.084s 00:11:44.288 sys 0m0.421s 00:11:44.288 15:48:04 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:44.288 15:48:04 
blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:44.288 ************************************ 00:11:44.288 END TEST bdev_stat 00:11:44.288 ************************************ 00:11:44.288 15:48:04 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:11:44.288 15:48:04 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:11:44.288 00:11:44.288 real 1m48.594s 00:11:44.288 user 7m15.299s 00:11:44.288 sys 0m15.595s 00:11:44.288 15:48:04 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:44.288 15:48:04 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:44.288 ************************************ 00:11:44.288 END TEST blockdev_general 00:11:44.288 ************************************ 00:11:44.288 15:48:04 -- common/autotest_common.sh@1142 -- # return 0 00:11:44.288 15:48:04 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:44.288 15:48:04 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:44.288 15:48:04 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:44.288 15:48:04 -- common/autotest_common.sh@10 -- # set +x 00:11:44.288 ************************************ 00:11:44.288 START TEST bdev_raid 00:11:44.288 ************************************ 00:11:44.288 15:48:04 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:11:44.549 * Looking for test storage... 
00:11:44.549 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:11:44.549 15:48:04 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:11:44.549 15:48:04 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:11:44.549 15:48:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:44.549 15:48:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:44.549 15:48:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:44.549 ************************************ 00:11:44.549 START TEST raid_function_test_raid0 00:11:44.549 ************************************ 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=2496180 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2496180' 00:11:44.549 Process raid pid: 2496180 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 2496180 /var/tmp/spdk-raid.sock 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 2496180 ']' 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:44.549 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
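Once bdev_svc is listening on /var/tmp/spdk-raid.sock, the function test batches the base-bdev and raid0 creation RPCs through rpc.py (the batch file itself is not echoed in the trace), exports the resulting raid bdev over NBD, and then checks that discarded ranges read back as zeros on the malloc-backed members. A condensed sketch of the unmap/data-verify loop the trace below steps through, with the same socket, device node, block offsets, and lengths:

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
nbd=/dev/nbd0

$RPC nbd_start_disk raid "$nbd"                     # expose the raid0 bdev named "raid" as /dev/nbd0

# seed a 4096-block (2 MiB) random pattern and mirror it onto the raid device
dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096
dd if=/raidtest/raidrandtest of="$nbd" bs=512 count=4096 oflag=direct
blockdev --flushbufs "$nbd"
cmp -b -n 2097152 /raidtest/raidrandtest "$nbd"     # byte-for-byte match expected

# discard three ranges on the device and zero the same ranges in the reference file;
# after each round the device and the file must still compare equal
unmap_blk_offs=(0 1028 321)
unmap_blk_nums=(128 2035 456)
for i in 0 1 2; do
    off=${unmap_blk_offs[$i]}; num=${unmap_blk_nums[$i]}
    dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek="$off" count="$num" conv=notrunc
    blkdiscard -o $((off * 512)) -l $((num * 512)) "$nbd"
    blockdev --flushbufs "$nbd"
    cmp -b -n 2097152 /raidtest/raidrandtest "$nbd"
done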
00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:44.549 15:48:04 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:44.549 [2024-07-12 15:48:04.915361] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:11:44.549 [2024-07-12 15:48:04.915429] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:44.809 [2024-07-12 15:48:05.010381] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:44.809 [2024-07-12 15:48:05.105156] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:44.809 [2024-07-12 15:48:05.157595] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:44.809 [2024-07-12 15:48:05.157628] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:45.380 15:48:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:45.380 15:48:05 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:11:45.380 15:48:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:11:45.380 15:48:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:11:45.380 15:48:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:45.380 15:48:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:11:45.380 15:48:05 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:45.640 [2024-07-12 15:48:06.001102] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:45.640 [2024-07-12 15:48:06.002436] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:45.640 [2024-07-12 15:48:06.002506] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x24e3a50 00:11:45.640 [2024-07-12 15:48:06.002514] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:45.640 [2024-07-12 15:48:06.002691] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24e3990 00:11:45.640 [2024-07-12 15:48:06.002816] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24e3a50 00:11:45.641 [2024-07-12 15:48:06.002823] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x24e3a50 00:11:45.641 [2024-07-12 15:48:06.002911] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:45.641 Base_1 00:11:45.641 Base_2 00:11:45.641 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:45.641 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:45.641 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:45.901 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:45.901 15:48:06 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:45.901 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:45.901 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:45.901 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:45.902 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:45.902 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:45.902 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:11:45.902 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:11:45.902 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:45.902 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:45.902 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:46.163 [2024-07-12 15:48:06.430217] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2697760 00:11:46.163 /dev/nbd0 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:46.163 1+0 records in 00:11:46.163 1+0 records out 00:11:46.163 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000299658 s, 13.7 MB/s 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:11:46.163 15:48:06 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:46.163 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:46.424 { 00:11:46.424 "nbd_device": "/dev/nbd0", 00:11:46.424 "bdev_name": "raid" 00:11:46.424 } 00:11:46.424 ]' 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:46.424 { 00:11:46.424 "nbd_device": "/dev/nbd0", 00:11:46.424 "bdev_name": "raid" 00:11:46.424 } 00:11:46.424 ]' 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:46.424 
15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:46.424 4096+0 records in 00:11:46.424 4096+0 records out 00:11:46.424 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0266743 s, 78.6 MB/s 00:11:46.424 15:48:06 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:46.684 4096+0 records in 00:11:46.684 4096+0 records out 00:11:46.684 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.230574 s, 9.1 MB/s 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:46.684 128+0 records in 00:11:46.684 128+0 records out 00:11:46.684 65536 bytes (66 kB, 64 KiB) copied, 0.000121216 s, 541 MB/s 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:46.684 2035+0 records in 00:11:46.684 2035+0 records out 00:11:46.684 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00482217 s, 216 MB/s 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:46.684 456+0 records in 00:11:46.684 456+0 records out 00:11:46.684 233472 bytes (233 kB, 228 KiB) copied, 0.00114281 s, 204 MB/s 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:46.684 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:46.943 [2024-07-12 15:48:07.332705] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:46.943 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 2496180 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 2496180 ']' 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 2496180 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:47.203 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2496180 00:11:47.463 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:47.463 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:47.463 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2496180' 00:11:47.463 killing process with pid 2496180 00:11:47.463 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 2496180 00:11:47.463 [2024-07-12 15:48:07.670792] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:47.463 [2024-07-12 15:48:07.670858] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:47.463 [2024-07-12 15:48:07.670895] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:47.463 [2024-07-12 15:48:07.670904] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24e3a50 name raid, state offline 00:11:47.463 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 2496180 00:11:47.463 [2024-07-12 15:48:07.687182] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:47.463 15:48:07 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:11:47.463 00:11:47.463 real 0m2.975s 00:11:47.463 user 0m4.070s 00:11:47.463 sys 0m0.886s 00:11:47.463 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:47.463 15:48:07 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:11:47.463 ************************************ 00:11:47.463 END TEST raid_function_test_raid0 00:11:47.463 ************************************ 00:11:47.463 15:48:07 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:47.463 15:48:07 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:11:47.463 15:48:07 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:47.463 15:48:07 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:47.463 15:48:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:47.463 ************************************ 00:11:47.463 START TEST raid_function_test_concat 00:11:47.463 ************************************ 00:11:47.463 15:48:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=2496688 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2496688' 00:11:47.723 Process raid pid: 2496688 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 2496688 /var/tmp/spdk-raid.sock 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 2496688 ']' 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:47.723 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:47.723 15:48:07 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:47.723 [2024-07-12 15:48:07.965728] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:11:47.723 [2024-07-12 15:48:07.965798] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:47.723 [2024-07-12 15:48:08.058325] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:47.723 [2024-07-12 15:48:08.155088] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:47.982 [2024-07-12 15:48:08.217565] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:47.982 [2024-07-12 15:48:08.217595] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:48.551 15:48:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:48.551 15:48:08 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:11:48.551 15:48:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:11:48.551 15:48:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:11:48.551 15:48:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:48.551 15:48:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:11:48.551 15:48:08 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:48.810 [2024-07-12 15:48:09.047655] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:48.810 [2024-07-12 15:48:09.048986] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:48.810 [2024-07-12 15:48:09.049048] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9f4a50 00:11:48.810 [2024-07-12 15:48:09.049054] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:48.810 [2024-07-12 15:48:09.049226] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9f4990 00:11:48.810 [2024-07-12 15:48:09.049332] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9f4a50 00:11:48.810 [2024-07-12 15:48:09.049338] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x9f4a50 00:11:48.810 [2024-07-12 15:48:09.049422] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:48.810 Base_1 00:11:48.810 Base_2 00:11:48.810 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:48.810 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:48.810 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:49.070 [2024-07-12 15:48:09.480770] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xba8760 00:11:49.070 /dev/nbd0 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:11:49.070 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:49.395 1+0 records in 00:11:49.395 1+0 records out 00:11:49.395 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000340247 s, 12.0 MB/s 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:49.395 
15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:49.395 { 00:11:49.395 "nbd_device": "/dev/nbd0", 00:11:49.395 "bdev_name": "raid" 00:11:49.395 } 00:11:49.395 ]' 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:49.395 { 00:11:49.395 "nbd_device": "/dev/nbd0", 00:11:49.395 "bdev_name": "raid" 00:11:49.395 } 00:11:49.395 ]' 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:49.395 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd 
if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:49.656 4096+0 records in 00:11:49.656 4096+0 records out 00:11:49.656 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0267873 s, 78.3 MB/s 00:11:49.656 15:48:09 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:49.656 4096+0 records in 00:11:49.656 4096+0 records out 00:11:49.656 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.199814 s, 10.5 MB/s 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:11:49.656 128+0 records in 00:11:49.656 128+0 records out 00:11:49.656 65536 bytes (66 kB, 64 KiB) copied, 0.000435002 s, 151 MB/s 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:49.656 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:49.917 2035+0 records in 00:11:49.917 2035+0 records out 00:11:49.917 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00527022 s, 198 MB/s 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:49.917 456+0 records in 00:11:49.917 456+0 
records out 00:11:49.917 233472 bytes (233 kB, 228 KiB) copied, 0.000393735 s, 593 MB/s 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:49.917 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:50.178 [2024-07-12 15:48:10.366527] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:50.178 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # true 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 2496688 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 2496688 ']' 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 2496688 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2496688 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2496688' 00:11:50.437 killing process with pid 2496688 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 2496688 00:11:50.437 [2024-07-12 15:48:10.696530] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:50.437 [2024-07-12 15:48:10.696596] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:50.437 [2024-07-12 15:48:10.696631] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:50.437 [2024-07-12 15:48:10.696639] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9f4a50 name raid, state offline 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 2496688 00:11:50.437 [2024-07-12 15:48:10.712705] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:11:50.437 00:11:50.437 real 0m2.943s 00:11:50.437 user 0m4.018s 00:11:50.437 sys 0m0.922s 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:50.437 15:48:10 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:50.437 ************************************ 00:11:50.437 END TEST raid_function_test_concat 00:11:50.437 ************************************ 00:11:50.697 15:48:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:50.697 15:48:10 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:11:50.697 15:48:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:11:50.697 15:48:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:50.697 15:48:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
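For anyone re-running the raid_function_test flow traced above by hand, the following is a minimal sketch of the command sequence the test drives, assembled only from commands that appear verbatim in this trace. It assumes an SPDK bdev_svc application is already listening on /var/tmp/spdk-raid.sock and that a raid0 bdev named "raid" has already been configured from Base_1 and Base_2 (the rpcs.txt payload that creates the base bdevs is not echoed in the trace, so that step is omitted here); the $rpc and $sock shell variables are shorthand introduced for readability only.

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # confirm the raid bdev is online, then expose it through NBD
  $rpc -s $sock bdev_raid_get_bdevs online | jq -r '.[0]["name"] | select(.)'
  $rpc -s $sock nbd_start_disk raid /dev/nbd0
  # write a 2 MiB random pattern through /dev/nbd0 and verify it reads back identically
  dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096
  dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct
  blockdev --flushbufs /dev/nbd0
  cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
  # unmap a block range on the device, zero the same range in the reference
  # file, and check the two still match (first iteration: offset 0, 128 blocks)
  dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc
  blkdiscard -o 0 -l 65536 /dev/nbd0
  blockdev --flushbufs /dev/nbd0
  cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
  # tear down the NBD attachment when done
  $rpc -s $sock nbd_stop_disk /dev/nbd0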
00:11:50.697 ************************************ 00:11:50.697 START TEST raid0_resize_test 00:11:50.697 ************************************ 00:11:50.697 15:48:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:11:50.697 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:11:50.697 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:11:50.697 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:11:50.697 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=2497611 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 2497611' 00:11:50.698 Process raid pid: 2497611 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 2497611 /var/tmp/spdk-raid.sock 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 2497611 ']' 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:50.698 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:50.698 15:48:10 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.698 [2024-07-12 15:48:10.985810] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:11:50.698 [2024-07-12 15:48:10.985858] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:50.698 [2024-07-12 15:48:11.074528] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:50.698 [2024-07-12 15:48:11.139450] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:50.957 [2024-07-12 15:48:11.178527] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:50.957 [2024-07-12 15:48:11.178548] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:51.528 15:48:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:51.528 15:48:11 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:11:51.528 15:48:11 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:11:51.788 Base_1 00:11:51.788 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:11:51.788 Base_2 00:11:51.788 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:11:52.050 [2024-07-12 15:48:12.394526] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:52.050 [2024-07-12 15:48:12.395655] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:52.050 [2024-07-12 15:48:12.395689] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1924600 00:11:52.050 [2024-07-12 15:48:12.395694] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:52.050 [2024-07-12 15:48:12.395852] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x146f1d0 00:11:52.050 [2024-07-12 15:48:12.395921] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1924600 00:11:52.050 [2024-07-12 15:48:12.395926] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x1924600 00:11:52.050 [2024-07-12 15:48:12.395999] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:52.050 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:11:52.310 [2024-07-12 15:48:12.574961] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:52.310 [2024-07-12 15:48:12.574972] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:11:52.310 true 00:11:52.310 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:52.310 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:11:52.570 [2024-07-12 15:48:12.767570] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:52.570 15:48:12 bdev_raid.raid0_resize_test -- 
bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:11:52.570 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:11:52.570 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:11:52.570 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:11:52.570 [2024-07-12 15:48:12.959922] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:52.570 [2024-07-12 15:48:12.959933] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:11:52.570 [2024-07-12 15:48:12.959947] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:11:52.570 true 00:11:52.570 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:52.570 15:48:12 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:11:52.830 [2024-07-12 15:48:13.148520] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 2497611 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 2497611 ']' 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 2497611 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2497611 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2497611' 00:11:52.830 killing process with pid 2497611 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 2497611 00:11:52.830 [2024-07-12 15:48:13.217096] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:52.830 [2024-07-12 15:48:13.217135] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:52.830 [2024-07-12 15:48:13.217164] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:52.830 [2024-07-12 15:48:13.217169] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1924600 name Raid, state offline 00:11:52.830 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 2497611 00:11:52.830 [2024-07-12 15:48:13.218072] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:53.090 15:48:13 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 
00:11:53.090 00:11:53.090 real 0m2.395s 00:11:53.091 user 0m3.737s 00:11:53.091 sys 0m0.429s 00:11:53.091 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:53.091 15:48:13 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:53.091 ************************************ 00:11:53.091 END TEST raid0_resize_test 00:11:53.091 ************************************ 00:11:53.091 15:48:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:11:53.091 15:48:13 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:11:53.091 15:48:13 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:53.091 15:48:13 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:11:53.091 15:48:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:53.091 15:48:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:53.091 15:48:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:53.091 ************************************ 00:11:53.091 START TEST raid_state_function_test 00:11:53.091 ************************************ 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:53.091 15:48:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2498104 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2498104' 00:11:53.091 Process raid pid: 2498104 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2498104 /var/tmp/spdk-raid.sock 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2498104 ']' 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:53.091 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:53.091 15:48:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:53.091 [2024-07-12 15:48:13.441016] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:11:53.091 [2024-07-12 15:48:13.441069] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:53.091 [2024-07-12 15:48:13.532547] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:53.352 [2024-07-12 15:48:13.598989] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:53.352 [2024-07-12 15:48:13.648002] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:53.352 [2024-07-12 15:48:13.648024] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:53.921 15:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:53.921 15:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:11:53.921 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:54.182 [2024-07-12 15:48:14.475352] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:54.182 [2024-07-12 15:48:14.475381] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:54.182 [2024-07-12 15:48:14.475387] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:54.182 [2024-07-12 15:48:14.475393] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:54.182 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:54.441 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:54.441 "name": "Existed_Raid", 00:11:54.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.441 "strip_size_kb": 64, 00:11:54.441 "state": "configuring", 00:11:54.441 "raid_level": "raid0", 00:11:54.441 "superblock": false, 00:11:54.441 
"num_base_bdevs": 2, 00:11:54.441 "num_base_bdevs_discovered": 0, 00:11:54.441 "num_base_bdevs_operational": 2, 00:11:54.441 "base_bdevs_list": [ 00:11:54.441 { 00:11:54.441 "name": "BaseBdev1", 00:11:54.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.441 "is_configured": false, 00:11:54.441 "data_offset": 0, 00:11:54.441 "data_size": 0 00:11:54.441 }, 00:11:54.441 { 00:11:54.441 "name": "BaseBdev2", 00:11:54.441 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:54.441 "is_configured": false, 00:11:54.441 "data_offset": 0, 00:11:54.441 "data_size": 0 00:11:54.441 } 00:11:54.441 ] 00:11:54.441 }' 00:11:54.441 15:48:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:54.441 15:48:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:55.011 15:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:55.011 [2024-07-12 15:48:15.389591] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:55.011 [2024-07-12 15:48:15.389607] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c19900 name Existed_Raid, state configuring 00:11:55.011 15:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:55.271 [2024-07-12 15:48:15.574071] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:55.271 [2024-07-12 15:48:15.574089] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:55.271 [2024-07-12 15:48:15.574094] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:55.271 [2024-07-12 15:48:15.574099] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:55.271 15:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:55.532 [2024-07-12 15:48:15.764933] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:55.532 BaseBdev1 00:11:55.532 15:48:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:55.532 15:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:11:55.532 15:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:55.532 15:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:55.532 15:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:55.532 15:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:55.532 15:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:55.532 15:48:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:55.793 [ 
00:11:55.793 { 00:11:55.793 "name": "BaseBdev1", 00:11:55.793 "aliases": [ 00:11:55.793 "d9821284-e9e2-46c8-9976-c3296669d3d1" 00:11:55.793 ], 00:11:55.793 "product_name": "Malloc disk", 00:11:55.793 "block_size": 512, 00:11:55.793 "num_blocks": 65536, 00:11:55.793 "uuid": "d9821284-e9e2-46c8-9976-c3296669d3d1", 00:11:55.793 "assigned_rate_limits": { 00:11:55.793 "rw_ios_per_sec": 0, 00:11:55.793 "rw_mbytes_per_sec": 0, 00:11:55.793 "r_mbytes_per_sec": 0, 00:11:55.793 "w_mbytes_per_sec": 0 00:11:55.793 }, 00:11:55.793 "claimed": true, 00:11:55.793 "claim_type": "exclusive_write", 00:11:55.793 "zoned": false, 00:11:55.793 "supported_io_types": { 00:11:55.793 "read": true, 00:11:55.793 "write": true, 00:11:55.793 "unmap": true, 00:11:55.793 "flush": true, 00:11:55.793 "reset": true, 00:11:55.793 "nvme_admin": false, 00:11:55.793 "nvme_io": false, 00:11:55.793 "nvme_io_md": false, 00:11:55.793 "write_zeroes": true, 00:11:55.793 "zcopy": true, 00:11:55.793 "get_zone_info": false, 00:11:55.793 "zone_management": false, 00:11:55.793 "zone_append": false, 00:11:55.793 "compare": false, 00:11:55.793 "compare_and_write": false, 00:11:55.793 "abort": true, 00:11:55.793 "seek_hole": false, 00:11:55.793 "seek_data": false, 00:11:55.793 "copy": true, 00:11:55.793 "nvme_iov_md": false 00:11:55.793 }, 00:11:55.793 "memory_domains": [ 00:11:55.793 { 00:11:55.793 "dma_device_id": "system", 00:11:55.793 "dma_device_type": 1 00:11:55.793 }, 00:11:55.793 { 00:11:55.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:55.793 "dma_device_type": 2 00:11:55.793 } 00:11:55.793 ], 00:11:55.793 "driver_specific": {} 00:11:55.793 } 00:11:55.793 ] 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.793 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:56.053 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:56.053 "name": "Existed_Raid", 00:11:56.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:56.053 "strip_size_kb": 64, 00:11:56.053 "state": "configuring", 00:11:56.053 "raid_level": "raid0", 
00:11:56.053 "superblock": false, 00:11:56.053 "num_base_bdevs": 2, 00:11:56.053 "num_base_bdevs_discovered": 1, 00:11:56.053 "num_base_bdevs_operational": 2, 00:11:56.053 "base_bdevs_list": [ 00:11:56.053 { 00:11:56.053 "name": "BaseBdev1", 00:11:56.053 "uuid": "d9821284-e9e2-46c8-9976-c3296669d3d1", 00:11:56.053 "is_configured": true, 00:11:56.053 "data_offset": 0, 00:11:56.053 "data_size": 65536 00:11:56.053 }, 00:11:56.053 { 00:11:56.053 "name": "BaseBdev2", 00:11:56.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:56.053 "is_configured": false, 00:11:56.053 "data_offset": 0, 00:11:56.053 "data_size": 0 00:11:56.053 } 00:11:56.053 ] 00:11:56.053 }' 00:11:56.053 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:56.053 15:48:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.623 15:48:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:56.623 [2024-07-12 15:48:17.040155] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:56.623 [2024-07-12 15:48:17.040180] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c191d0 name Existed_Raid, state configuring 00:11:56.623 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:56.884 [2024-07-12 15:48:17.232667] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:56.884 [2024-07-12 15:48:17.233816] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:56.884 [2024-07-12 15:48:17.233840] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs 
all 00:11:56.884 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:57.144 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.144 "name": "Existed_Raid", 00:11:57.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.144 "strip_size_kb": 64, 00:11:57.144 "state": "configuring", 00:11:57.144 "raid_level": "raid0", 00:11:57.144 "superblock": false, 00:11:57.144 "num_base_bdevs": 2, 00:11:57.144 "num_base_bdevs_discovered": 1, 00:11:57.144 "num_base_bdevs_operational": 2, 00:11:57.144 "base_bdevs_list": [ 00:11:57.144 { 00:11:57.144 "name": "BaseBdev1", 00:11:57.144 "uuid": "d9821284-e9e2-46c8-9976-c3296669d3d1", 00:11:57.144 "is_configured": true, 00:11:57.144 "data_offset": 0, 00:11:57.144 "data_size": 65536 00:11:57.144 }, 00:11:57.144 { 00:11:57.144 "name": "BaseBdev2", 00:11:57.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.144 "is_configured": false, 00:11:57.144 "data_offset": 0, 00:11:57.144 "data_size": 0 00:11:57.144 } 00:11:57.144 ] 00:11:57.144 }' 00:11:57.144 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.144 15:48:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:57.714 15:48:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:57.714 [2024-07-12 15:48:18.143887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:57.714 [2024-07-12 15:48:18.143910] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c19e80 00:11:57.714 [2024-07-12 15:48:18.143914] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:57.714 [2024-07-12 15:48:18.144057] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1918290 00:11:57.715 [2024-07-12 15:48:18.144148] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c19e80 00:11:57.715 [2024-07-12 15:48:18.144154] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c19e80 00:11:57.715 [2024-07-12 15:48:18.144271] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:57.715 BaseBdev2 00:11:57.715 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:57.715 15:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:11:57.715 15:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:57.715 15:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:11:57.715 15:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:57.715 15:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:57.715 15:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:57.976 15:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:58.267 [ 
00:11:58.267 { 00:11:58.267 "name": "BaseBdev2", 00:11:58.267 "aliases": [ 00:11:58.267 "49675317-3531-4351-828f-b2ba4e69733f" 00:11:58.267 ], 00:11:58.267 "product_name": "Malloc disk", 00:11:58.267 "block_size": 512, 00:11:58.267 "num_blocks": 65536, 00:11:58.267 "uuid": "49675317-3531-4351-828f-b2ba4e69733f", 00:11:58.267 "assigned_rate_limits": { 00:11:58.267 "rw_ios_per_sec": 0, 00:11:58.267 "rw_mbytes_per_sec": 0, 00:11:58.267 "r_mbytes_per_sec": 0, 00:11:58.267 "w_mbytes_per_sec": 0 00:11:58.267 }, 00:11:58.267 "claimed": true, 00:11:58.267 "claim_type": "exclusive_write", 00:11:58.267 "zoned": false, 00:11:58.267 "supported_io_types": { 00:11:58.267 "read": true, 00:11:58.267 "write": true, 00:11:58.267 "unmap": true, 00:11:58.267 "flush": true, 00:11:58.267 "reset": true, 00:11:58.267 "nvme_admin": false, 00:11:58.268 "nvme_io": false, 00:11:58.268 "nvme_io_md": false, 00:11:58.268 "write_zeroes": true, 00:11:58.268 "zcopy": true, 00:11:58.268 "get_zone_info": false, 00:11:58.268 "zone_management": false, 00:11:58.268 "zone_append": false, 00:11:58.268 "compare": false, 00:11:58.268 "compare_and_write": false, 00:11:58.268 "abort": true, 00:11:58.268 "seek_hole": false, 00:11:58.268 "seek_data": false, 00:11:58.268 "copy": true, 00:11:58.268 "nvme_iov_md": false 00:11:58.268 }, 00:11:58.268 "memory_domains": [ 00:11:58.268 { 00:11:58.268 "dma_device_id": "system", 00:11:58.268 "dma_device_type": 1 00:11:58.268 }, 00:11:58.268 { 00:11:58.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.268 "dma_device_type": 2 00:11:58.268 } 00:11:58.268 ], 00:11:58.268 "driver_specific": {} 00:11:58.268 } 00:11:58.268 ] 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.268 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:58.528 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:11:58.528 "name": "Existed_Raid", 00:11:58.528 "uuid": "4b927a36-f6c9-4f52-a723-4e468ef66136", 00:11:58.528 "strip_size_kb": 64, 00:11:58.528 "state": "online", 00:11:58.528 "raid_level": "raid0", 00:11:58.528 "superblock": false, 00:11:58.528 "num_base_bdevs": 2, 00:11:58.528 "num_base_bdevs_discovered": 2, 00:11:58.528 "num_base_bdevs_operational": 2, 00:11:58.528 "base_bdevs_list": [ 00:11:58.528 { 00:11:58.528 "name": "BaseBdev1", 00:11:58.528 "uuid": "d9821284-e9e2-46c8-9976-c3296669d3d1", 00:11:58.528 "is_configured": true, 00:11:58.528 "data_offset": 0, 00:11:58.528 "data_size": 65536 00:11:58.528 }, 00:11:58.528 { 00:11:58.528 "name": "BaseBdev2", 00:11:58.528 "uuid": "49675317-3531-4351-828f-b2ba4e69733f", 00:11:58.528 "is_configured": true, 00:11:58.528 "data_offset": 0, 00:11:58.528 "data_size": 65536 00:11:58.528 } 00:11:58.528 ] 00:11:58.528 }' 00:11:58.528 15:48:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.528 15:48:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:59.098 [2024-07-12 15:48:19.451403] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:59.098 "name": "Existed_Raid", 00:11:59.098 "aliases": [ 00:11:59.098 "4b927a36-f6c9-4f52-a723-4e468ef66136" 00:11:59.098 ], 00:11:59.098 "product_name": "Raid Volume", 00:11:59.098 "block_size": 512, 00:11:59.098 "num_blocks": 131072, 00:11:59.098 "uuid": "4b927a36-f6c9-4f52-a723-4e468ef66136", 00:11:59.098 "assigned_rate_limits": { 00:11:59.098 "rw_ios_per_sec": 0, 00:11:59.098 "rw_mbytes_per_sec": 0, 00:11:59.098 "r_mbytes_per_sec": 0, 00:11:59.098 "w_mbytes_per_sec": 0 00:11:59.098 }, 00:11:59.098 "claimed": false, 00:11:59.098 "zoned": false, 00:11:59.098 "supported_io_types": { 00:11:59.098 "read": true, 00:11:59.098 "write": true, 00:11:59.098 "unmap": true, 00:11:59.098 "flush": true, 00:11:59.098 "reset": true, 00:11:59.098 "nvme_admin": false, 00:11:59.098 "nvme_io": false, 00:11:59.098 "nvme_io_md": false, 00:11:59.098 "write_zeroes": true, 00:11:59.098 "zcopy": false, 00:11:59.098 "get_zone_info": false, 00:11:59.098 "zone_management": false, 00:11:59.098 "zone_append": false, 00:11:59.098 "compare": false, 00:11:59.098 "compare_and_write": false, 00:11:59.098 "abort": false, 00:11:59.098 "seek_hole": false, 00:11:59.098 "seek_data": false, 00:11:59.098 "copy": false, 00:11:59.098 "nvme_iov_md": false 00:11:59.098 }, 00:11:59.098 
"memory_domains": [ 00:11:59.098 { 00:11:59.098 "dma_device_id": "system", 00:11:59.098 "dma_device_type": 1 00:11:59.098 }, 00:11:59.098 { 00:11:59.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.098 "dma_device_type": 2 00:11:59.098 }, 00:11:59.098 { 00:11:59.098 "dma_device_id": "system", 00:11:59.098 "dma_device_type": 1 00:11:59.098 }, 00:11:59.098 { 00:11:59.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.098 "dma_device_type": 2 00:11:59.098 } 00:11:59.098 ], 00:11:59.098 "driver_specific": { 00:11:59.098 "raid": { 00:11:59.098 "uuid": "4b927a36-f6c9-4f52-a723-4e468ef66136", 00:11:59.098 "strip_size_kb": 64, 00:11:59.098 "state": "online", 00:11:59.098 "raid_level": "raid0", 00:11:59.098 "superblock": false, 00:11:59.098 "num_base_bdevs": 2, 00:11:59.098 "num_base_bdevs_discovered": 2, 00:11:59.098 "num_base_bdevs_operational": 2, 00:11:59.098 "base_bdevs_list": [ 00:11:59.098 { 00:11:59.098 "name": "BaseBdev1", 00:11:59.098 "uuid": "d9821284-e9e2-46c8-9976-c3296669d3d1", 00:11:59.098 "is_configured": true, 00:11:59.098 "data_offset": 0, 00:11:59.098 "data_size": 65536 00:11:59.098 }, 00:11:59.098 { 00:11:59.098 "name": "BaseBdev2", 00:11:59.098 "uuid": "49675317-3531-4351-828f-b2ba4e69733f", 00:11:59.098 "is_configured": true, 00:11:59.098 "data_offset": 0, 00:11:59.098 "data_size": 65536 00:11:59.098 } 00:11:59.098 ] 00:11:59.098 } 00:11:59.098 } 00:11:59.098 }' 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:59.098 BaseBdev2' 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:59.098 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:59.358 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:59.358 "name": "BaseBdev1", 00:11:59.358 "aliases": [ 00:11:59.358 "d9821284-e9e2-46c8-9976-c3296669d3d1" 00:11:59.358 ], 00:11:59.358 "product_name": "Malloc disk", 00:11:59.358 "block_size": 512, 00:11:59.358 "num_blocks": 65536, 00:11:59.358 "uuid": "d9821284-e9e2-46c8-9976-c3296669d3d1", 00:11:59.358 "assigned_rate_limits": { 00:11:59.358 "rw_ios_per_sec": 0, 00:11:59.358 "rw_mbytes_per_sec": 0, 00:11:59.358 "r_mbytes_per_sec": 0, 00:11:59.358 "w_mbytes_per_sec": 0 00:11:59.358 }, 00:11:59.358 "claimed": true, 00:11:59.358 "claim_type": "exclusive_write", 00:11:59.358 "zoned": false, 00:11:59.358 "supported_io_types": { 00:11:59.358 "read": true, 00:11:59.358 "write": true, 00:11:59.358 "unmap": true, 00:11:59.358 "flush": true, 00:11:59.358 "reset": true, 00:11:59.358 "nvme_admin": false, 00:11:59.358 "nvme_io": false, 00:11:59.358 "nvme_io_md": false, 00:11:59.358 "write_zeroes": true, 00:11:59.358 "zcopy": true, 00:11:59.358 "get_zone_info": false, 00:11:59.358 "zone_management": false, 00:11:59.358 "zone_append": false, 00:11:59.358 "compare": false, 00:11:59.358 "compare_and_write": false, 00:11:59.358 "abort": true, 00:11:59.358 "seek_hole": false, 00:11:59.358 "seek_data": false, 00:11:59.358 "copy": true, 00:11:59.358 "nvme_iov_md": false 00:11:59.358 }, 00:11:59.358 
"memory_domains": [ 00:11:59.358 { 00:11:59.358 "dma_device_id": "system", 00:11:59.358 "dma_device_type": 1 00:11:59.358 }, 00:11:59.358 { 00:11:59.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.358 "dma_device_type": 2 00:11:59.358 } 00:11:59.358 ], 00:11:59.358 "driver_specific": {} 00:11:59.358 }' 00:11:59.358 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:59.359 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:59.359 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:59.359 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:59.618 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:59.618 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:59.618 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:59.618 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:59.618 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:59.618 15:48:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.618 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:59.618 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:59.618 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:59.618 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:59.618 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:59.878 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:59.878 "name": "BaseBdev2", 00:11:59.878 "aliases": [ 00:11:59.878 "49675317-3531-4351-828f-b2ba4e69733f" 00:11:59.878 ], 00:11:59.878 "product_name": "Malloc disk", 00:11:59.878 "block_size": 512, 00:11:59.878 "num_blocks": 65536, 00:11:59.878 "uuid": "49675317-3531-4351-828f-b2ba4e69733f", 00:11:59.878 "assigned_rate_limits": { 00:11:59.878 "rw_ios_per_sec": 0, 00:11:59.878 "rw_mbytes_per_sec": 0, 00:11:59.878 "r_mbytes_per_sec": 0, 00:11:59.878 "w_mbytes_per_sec": 0 00:11:59.878 }, 00:11:59.878 "claimed": true, 00:11:59.878 "claim_type": "exclusive_write", 00:11:59.878 "zoned": false, 00:11:59.878 "supported_io_types": { 00:11:59.878 "read": true, 00:11:59.878 "write": true, 00:11:59.878 "unmap": true, 00:11:59.878 "flush": true, 00:11:59.878 "reset": true, 00:11:59.878 "nvme_admin": false, 00:11:59.878 "nvme_io": false, 00:11:59.878 "nvme_io_md": false, 00:11:59.878 "write_zeroes": true, 00:11:59.878 "zcopy": true, 00:11:59.878 "get_zone_info": false, 00:11:59.878 "zone_management": false, 00:11:59.878 "zone_append": false, 00:11:59.878 "compare": false, 00:11:59.878 "compare_and_write": false, 00:11:59.878 "abort": true, 00:11:59.878 "seek_hole": false, 00:11:59.878 "seek_data": false, 00:11:59.878 "copy": true, 00:11:59.878 "nvme_iov_md": false 00:11:59.878 }, 00:11:59.878 "memory_domains": [ 00:11:59.878 { 00:11:59.878 "dma_device_id": "system", 00:11:59.878 "dma_device_type": 1 00:11:59.878 }, 00:11:59.878 { 00:11:59.878 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:11:59.878 "dma_device_type": 2 00:11:59.878 } 00:11:59.878 ], 00:11:59.878 "driver_specific": {} 00:11:59.878 }' 00:11:59.878 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:59.878 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:00.137 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:00.137 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.137 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:00.137 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:00.137 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.138 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:00.138 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:00.138 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.138 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:00.397 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:00.397 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:00.397 [2024-07-12 15:48:20.774671] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:00.397 [2024-07-12 15:48:20.774694] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:00.398 [2024-07-12 15:48:20.774729] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.398 15:48:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.398 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:00.658 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.658 "name": "Existed_Raid", 00:12:00.658 "uuid": "4b927a36-f6c9-4f52-a723-4e468ef66136", 00:12:00.658 "strip_size_kb": 64, 00:12:00.658 "state": "offline", 00:12:00.658 "raid_level": "raid0", 00:12:00.658 "superblock": false, 00:12:00.658 "num_base_bdevs": 2, 00:12:00.658 "num_base_bdevs_discovered": 1, 00:12:00.658 "num_base_bdevs_operational": 1, 00:12:00.658 "base_bdevs_list": [ 00:12:00.658 { 00:12:00.658 "name": null, 00:12:00.658 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:00.658 "is_configured": false, 00:12:00.658 "data_offset": 0, 00:12:00.658 "data_size": 65536 00:12:00.658 }, 00:12:00.658 { 00:12:00.658 "name": "BaseBdev2", 00:12:00.658 "uuid": "49675317-3531-4351-828f-b2ba4e69733f", 00:12:00.658 "is_configured": true, 00:12:00.658 "data_offset": 0, 00:12:00.658 "data_size": 65536 00:12:00.658 } 00:12:00.658 ] 00:12:00.658 }' 00:12:00.658 15:48:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.658 15:48:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.227 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:01.227 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:01.227 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.227 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:01.487 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:01.487 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:01.487 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:01.487 [2024-07-12 15:48:21.857410] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:01.487 [2024-07-12 15:48:21.857445] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c19e80 name Existed_Raid, state offline 00:12:01.487 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:01.487 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:01.487 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.487 15:48:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:01.746 15:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:01.746 15:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:01.746 15:48:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2498104 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2498104 ']' 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2498104 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2498104 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2498104' 00:12:01.747 killing process with pid 2498104 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2498104 00:12:01.747 [2024-07-12 15:48:22.117992] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:01.747 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2498104 00:12:01.747 [2024-07-12 15:48:22.118584] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:02.007 00:12:02.007 real 0m8.844s 00:12:02.007 user 0m16.086s 00:12:02.007 sys 0m1.343s 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.007 ************************************ 00:12:02.007 END TEST raid_state_function_test 00:12:02.007 ************************************ 00:12:02.007 15:48:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:02.007 15:48:22 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:12:02.007 15:48:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:02.007 15:48:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:02.007 15:48:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:02.007 ************************************ 00:12:02.007 START TEST raid_state_function_test_sb 00:12:02.007 ************************************ 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:02.007 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2499862 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2499862' 00:12:02.008 Process raid pid: 2499862 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2499862 /var/tmp/spdk-raid.sock 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2499862 ']' 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:02.008 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:02.008 15:48:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:02.008 [2024-07-12 15:48:22.379478] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:12:02.008 [2024-07-12 15:48:22.379537] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:02.267 [2024-07-12 15:48:22.469289] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:02.267 [2024-07-12 15:48:22.537342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:02.268 [2024-07-12 15:48:22.588317] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:02.268 [2024-07-12 15:48:22.588342] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:02.836 15:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:02.836 15:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:02.836 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:03.095 [2024-07-12 15:48:23.388209] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:03.095 [2024-07-12 15:48:23.388237] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:03.095 [2024-07-12 15:48:23.388243] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:03.095 [2024-07-12 15:48:23.388249] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:03.095 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:03.354 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:03.354 "name": "Existed_Raid", 00:12:03.354 "uuid": "f0b09b6f-4299-4df6-900a-a6c801b32a19", 00:12:03.354 "strip_size_kb": 64, 00:12:03.354 "state": "configuring", 00:12:03.354 "raid_level": 
"raid0", 00:12:03.354 "superblock": true, 00:12:03.354 "num_base_bdevs": 2, 00:12:03.354 "num_base_bdevs_discovered": 0, 00:12:03.354 "num_base_bdevs_operational": 2, 00:12:03.354 "base_bdevs_list": [ 00:12:03.354 { 00:12:03.354 "name": "BaseBdev1", 00:12:03.354 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:03.354 "is_configured": false, 00:12:03.354 "data_offset": 0, 00:12:03.354 "data_size": 0 00:12:03.354 }, 00:12:03.354 { 00:12:03.354 "name": "BaseBdev2", 00:12:03.354 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:03.354 "is_configured": false, 00:12:03.354 "data_offset": 0, 00:12:03.354 "data_size": 0 00:12:03.354 } 00:12:03.354 ] 00:12:03.354 }' 00:12:03.354 15:48:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:03.354 15:48:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:03.924 15:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:03.924 [2024-07-12 15:48:24.278343] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:03.924 [2024-07-12 15:48:24.278359] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d65900 name Existed_Raid, state configuring 00:12:03.924 15:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:04.184 [2024-07-12 15:48:24.470861] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:04.184 [2024-07-12 15:48:24.470880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:04.184 [2024-07-12 15:48:24.470885] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:04.184 [2024-07-12 15:48:24.470891] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:04.184 15:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:04.444 [2024-07-12 15:48:24.674046] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:04.444 BaseBdev1 00:12:04.444 15:48:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:04.444 15:48:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:04.444 15:48:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:04.444 15:48:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:04.444 15:48:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:04.444 15:48:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:04.444 15:48:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:04.444 15:48:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:04.704 [ 00:12:04.704 { 00:12:04.704 "name": "BaseBdev1", 00:12:04.704 "aliases": [ 00:12:04.704 "f3db9ac3-be8d-4ba2-a6a0-025d57065ae6" 00:12:04.704 ], 00:12:04.704 "product_name": "Malloc disk", 00:12:04.704 "block_size": 512, 00:12:04.704 "num_blocks": 65536, 00:12:04.704 "uuid": "f3db9ac3-be8d-4ba2-a6a0-025d57065ae6", 00:12:04.704 "assigned_rate_limits": { 00:12:04.704 "rw_ios_per_sec": 0, 00:12:04.704 "rw_mbytes_per_sec": 0, 00:12:04.704 "r_mbytes_per_sec": 0, 00:12:04.704 "w_mbytes_per_sec": 0 00:12:04.704 }, 00:12:04.704 "claimed": true, 00:12:04.704 "claim_type": "exclusive_write", 00:12:04.704 "zoned": false, 00:12:04.704 "supported_io_types": { 00:12:04.704 "read": true, 00:12:04.704 "write": true, 00:12:04.704 "unmap": true, 00:12:04.704 "flush": true, 00:12:04.704 "reset": true, 00:12:04.704 "nvme_admin": false, 00:12:04.704 "nvme_io": false, 00:12:04.704 "nvme_io_md": false, 00:12:04.704 "write_zeroes": true, 00:12:04.704 "zcopy": true, 00:12:04.704 "get_zone_info": false, 00:12:04.704 "zone_management": false, 00:12:04.704 "zone_append": false, 00:12:04.704 "compare": false, 00:12:04.704 "compare_and_write": false, 00:12:04.704 "abort": true, 00:12:04.704 "seek_hole": false, 00:12:04.704 "seek_data": false, 00:12:04.704 "copy": true, 00:12:04.704 "nvme_iov_md": false 00:12:04.704 }, 00:12:04.704 "memory_domains": [ 00:12:04.704 { 00:12:04.704 "dma_device_id": "system", 00:12:04.704 "dma_device_type": 1 00:12:04.704 }, 00:12:04.704 { 00:12:04.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.705 "dma_device_type": 2 00:12:04.705 } 00:12:04.705 ], 00:12:04.705 "driver_specific": {} 00:12:04.705 } 00:12:04.705 ] 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.705 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:04.965 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:04.965 "name": 
"Existed_Raid", 00:12:04.965 "uuid": "dc0c4f55-471d-49cd-89da-18f30e647d18", 00:12:04.965 "strip_size_kb": 64, 00:12:04.965 "state": "configuring", 00:12:04.965 "raid_level": "raid0", 00:12:04.965 "superblock": true, 00:12:04.965 "num_base_bdevs": 2, 00:12:04.965 "num_base_bdevs_discovered": 1, 00:12:04.965 "num_base_bdevs_operational": 2, 00:12:04.965 "base_bdevs_list": [ 00:12:04.965 { 00:12:04.965 "name": "BaseBdev1", 00:12:04.965 "uuid": "f3db9ac3-be8d-4ba2-a6a0-025d57065ae6", 00:12:04.965 "is_configured": true, 00:12:04.965 "data_offset": 2048, 00:12:04.965 "data_size": 63488 00:12:04.965 }, 00:12:04.965 { 00:12:04.965 "name": "BaseBdev2", 00:12:04.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.965 "is_configured": false, 00:12:04.965 "data_offset": 0, 00:12:04.965 "data_size": 0 00:12:04.965 } 00:12:04.965 ] 00:12:04.965 }' 00:12:04.965 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:04.965 15:48:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:05.535 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:05.535 [2024-07-12 15:48:25.929276] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:05.535 [2024-07-12 15:48:25.929298] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d651d0 name Existed_Raid, state configuring 00:12:05.535 15:48:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:05.795 [2024-07-12 15:48:26.113776] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:05.795 [2024-07-12 15:48:26.114892] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:05.795 [2024-07-12 15:48:26.114913] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.795 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:06.056 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:06.056 "name": "Existed_Raid", 00:12:06.056 "uuid": "5e2e920c-44a6-4684-8a32-305596e4130b", 00:12:06.056 "strip_size_kb": 64, 00:12:06.056 "state": "configuring", 00:12:06.056 "raid_level": "raid0", 00:12:06.056 "superblock": true, 00:12:06.056 "num_base_bdevs": 2, 00:12:06.056 "num_base_bdevs_discovered": 1, 00:12:06.056 "num_base_bdevs_operational": 2, 00:12:06.056 "base_bdevs_list": [ 00:12:06.056 { 00:12:06.056 "name": "BaseBdev1", 00:12:06.056 "uuid": "f3db9ac3-be8d-4ba2-a6a0-025d57065ae6", 00:12:06.056 "is_configured": true, 00:12:06.056 "data_offset": 2048, 00:12:06.056 "data_size": 63488 00:12:06.056 }, 00:12:06.056 { 00:12:06.056 "name": "BaseBdev2", 00:12:06.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:06.056 "is_configured": false, 00:12:06.056 "data_offset": 0, 00:12:06.056 "data_size": 0 00:12:06.056 } 00:12:06.056 ] 00:12:06.056 }' 00:12:06.056 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:06.056 15:48:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:06.626 15:48:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:06.626 [2024-07-12 15:48:26.996965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:06.626 [2024-07-12 15:48:26.997070] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d65e80 00:12:06.626 [2024-07-12 15:48:26.997077] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:06.626 [2024-07-12 15:48:26.997210] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a64290 00:12:06.626 [2024-07-12 15:48:26.997297] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d65e80 00:12:06.626 [2024-07-12 15:48:26.997302] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d65e80 00:12:06.626 [2024-07-12 15:48:26.997367] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:06.626 BaseBdev2 00:12:06.626 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:06.626 15:48:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:06.626 15:48:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:06.626 15:48:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:06.626 15:48:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:06.626 15:48:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:06.626 15:48:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:06.886 15:48:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:07.147 [ 00:12:07.147 { 00:12:07.147 "name": "BaseBdev2", 00:12:07.147 "aliases": [ 00:12:07.147 "260f1d78-b91e-4da3-862e-f0d02940d881" 00:12:07.147 ], 00:12:07.147 "product_name": "Malloc disk", 00:12:07.147 "block_size": 512, 00:12:07.147 "num_blocks": 65536, 00:12:07.147 "uuid": "260f1d78-b91e-4da3-862e-f0d02940d881", 00:12:07.147 "assigned_rate_limits": { 00:12:07.147 "rw_ios_per_sec": 0, 00:12:07.147 "rw_mbytes_per_sec": 0, 00:12:07.147 "r_mbytes_per_sec": 0, 00:12:07.147 "w_mbytes_per_sec": 0 00:12:07.147 }, 00:12:07.147 "claimed": true, 00:12:07.147 "claim_type": "exclusive_write", 00:12:07.147 "zoned": false, 00:12:07.147 "supported_io_types": { 00:12:07.147 "read": true, 00:12:07.147 "write": true, 00:12:07.148 "unmap": true, 00:12:07.148 "flush": true, 00:12:07.148 "reset": true, 00:12:07.148 "nvme_admin": false, 00:12:07.148 "nvme_io": false, 00:12:07.148 "nvme_io_md": false, 00:12:07.148 "write_zeroes": true, 00:12:07.148 "zcopy": true, 00:12:07.148 "get_zone_info": false, 00:12:07.148 "zone_management": false, 00:12:07.148 "zone_append": false, 00:12:07.148 "compare": false, 00:12:07.148 "compare_and_write": false, 00:12:07.148 "abort": true, 00:12:07.148 "seek_hole": false, 00:12:07.148 "seek_data": false, 00:12:07.148 "copy": true, 00:12:07.148 "nvme_iov_md": false 00:12:07.148 }, 00:12:07.148 "memory_domains": [ 00:12:07.148 { 00:12:07.148 "dma_device_id": "system", 00:12:07.148 "dma_device_type": 1 00:12:07.148 }, 00:12:07.148 { 00:12:07.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.148 "dma_device_type": 2 00:12:07.148 } 00:12:07.148 ], 00:12:07.148 "driver_specific": {} 00:12:07.148 } 00:12:07.148 ] 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:07.148 "name": "Existed_Raid", 00:12:07.148 "uuid": "5e2e920c-44a6-4684-8a32-305596e4130b", 00:12:07.148 "strip_size_kb": 64, 00:12:07.148 "state": "online", 00:12:07.148 "raid_level": "raid0", 00:12:07.148 "superblock": true, 00:12:07.148 "num_base_bdevs": 2, 00:12:07.148 "num_base_bdevs_discovered": 2, 00:12:07.148 "num_base_bdevs_operational": 2, 00:12:07.148 "base_bdevs_list": [ 00:12:07.148 { 00:12:07.148 "name": "BaseBdev1", 00:12:07.148 "uuid": "f3db9ac3-be8d-4ba2-a6a0-025d57065ae6", 00:12:07.148 "is_configured": true, 00:12:07.148 "data_offset": 2048, 00:12:07.148 "data_size": 63488 00:12:07.148 }, 00:12:07.148 { 00:12:07.148 "name": "BaseBdev2", 00:12:07.148 "uuid": "260f1d78-b91e-4da3-862e-f0d02940d881", 00:12:07.148 "is_configured": true, 00:12:07.148 "data_offset": 2048, 00:12:07.148 "data_size": 63488 00:12:07.148 } 00:12:07.148 ] 00:12:07.148 }' 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.148 15:48:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:07.718 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:07.718 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:07.718 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:07.718 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:07.718 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:07.718 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:07.718 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:07.718 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:07.979 [2024-07-12 15:48:28.292448] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:07.979 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:07.979 "name": "Existed_Raid", 00:12:07.979 "aliases": [ 00:12:07.979 "5e2e920c-44a6-4684-8a32-305596e4130b" 00:12:07.979 ], 00:12:07.979 "product_name": "Raid Volume", 00:12:07.979 "block_size": 512, 00:12:07.979 "num_blocks": 126976, 00:12:07.979 "uuid": "5e2e920c-44a6-4684-8a32-305596e4130b", 00:12:07.979 "assigned_rate_limits": { 00:12:07.979 "rw_ios_per_sec": 0, 00:12:07.979 "rw_mbytes_per_sec": 0, 00:12:07.979 "r_mbytes_per_sec": 0, 00:12:07.979 "w_mbytes_per_sec": 0 00:12:07.979 }, 00:12:07.979 "claimed": false, 00:12:07.979 "zoned": false, 00:12:07.979 "supported_io_types": { 00:12:07.979 "read": true, 00:12:07.979 "write": true, 00:12:07.979 "unmap": true, 00:12:07.979 "flush": true, 00:12:07.979 "reset": true, 00:12:07.979 "nvme_admin": false, 00:12:07.979 "nvme_io": false, 00:12:07.979 "nvme_io_md": false, 00:12:07.979 "write_zeroes": true, 
00:12:07.979 "zcopy": false, 00:12:07.979 "get_zone_info": false, 00:12:07.979 "zone_management": false, 00:12:07.979 "zone_append": false, 00:12:07.979 "compare": false, 00:12:07.979 "compare_and_write": false, 00:12:07.979 "abort": false, 00:12:07.979 "seek_hole": false, 00:12:07.979 "seek_data": false, 00:12:07.979 "copy": false, 00:12:07.979 "nvme_iov_md": false 00:12:07.979 }, 00:12:07.979 "memory_domains": [ 00:12:07.979 { 00:12:07.979 "dma_device_id": "system", 00:12:07.979 "dma_device_type": 1 00:12:07.979 }, 00:12:07.979 { 00:12:07.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.979 "dma_device_type": 2 00:12:07.979 }, 00:12:07.979 { 00:12:07.979 "dma_device_id": "system", 00:12:07.979 "dma_device_type": 1 00:12:07.979 }, 00:12:07.979 { 00:12:07.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.979 "dma_device_type": 2 00:12:07.979 } 00:12:07.979 ], 00:12:07.979 "driver_specific": { 00:12:07.979 "raid": { 00:12:07.979 "uuid": "5e2e920c-44a6-4684-8a32-305596e4130b", 00:12:07.979 "strip_size_kb": 64, 00:12:07.979 "state": "online", 00:12:07.979 "raid_level": "raid0", 00:12:07.979 "superblock": true, 00:12:07.979 "num_base_bdevs": 2, 00:12:07.979 "num_base_bdevs_discovered": 2, 00:12:07.979 "num_base_bdevs_operational": 2, 00:12:07.979 "base_bdevs_list": [ 00:12:07.979 { 00:12:07.979 "name": "BaseBdev1", 00:12:07.979 "uuid": "f3db9ac3-be8d-4ba2-a6a0-025d57065ae6", 00:12:07.979 "is_configured": true, 00:12:07.979 "data_offset": 2048, 00:12:07.979 "data_size": 63488 00:12:07.979 }, 00:12:07.979 { 00:12:07.979 "name": "BaseBdev2", 00:12:07.979 "uuid": "260f1d78-b91e-4da3-862e-f0d02940d881", 00:12:07.979 "is_configured": true, 00:12:07.979 "data_offset": 2048, 00:12:07.979 "data_size": 63488 00:12:07.979 } 00:12:07.979 ] 00:12:07.979 } 00:12:07.979 } 00:12:07.979 }' 00:12:07.979 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:07.979 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:07.979 BaseBdev2' 00:12:07.979 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:07.979 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:07.979 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:08.240 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:08.240 "name": "BaseBdev1", 00:12:08.240 "aliases": [ 00:12:08.240 "f3db9ac3-be8d-4ba2-a6a0-025d57065ae6" 00:12:08.240 ], 00:12:08.240 "product_name": "Malloc disk", 00:12:08.240 "block_size": 512, 00:12:08.240 "num_blocks": 65536, 00:12:08.240 "uuid": "f3db9ac3-be8d-4ba2-a6a0-025d57065ae6", 00:12:08.240 "assigned_rate_limits": { 00:12:08.240 "rw_ios_per_sec": 0, 00:12:08.240 "rw_mbytes_per_sec": 0, 00:12:08.240 "r_mbytes_per_sec": 0, 00:12:08.240 "w_mbytes_per_sec": 0 00:12:08.240 }, 00:12:08.240 "claimed": true, 00:12:08.240 "claim_type": "exclusive_write", 00:12:08.240 "zoned": false, 00:12:08.240 "supported_io_types": { 00:12:08.240 "read": true, 00:12:08.240 "write": true, 00:12:08.240 "unmap": true, 00:12:08.240 "flush": true, 00:12:08.240 "reset": true, 00:12:08.240 "nvme_admin": false, 00:12:08.240 "nvme_io": false, 00:12:08.240 "nvme_io_md": false, 00:12:08.240 
"write_zeroes": true, 00:12:08.240 "zcopy": true, 00:12:08.240 "get_zone_info": false, 00:12:08.240 "zone_management": false, 00:12:08.240 "zone_append": false, 00:12:08.240 "compare": false, 00:12:08.240 "compare_and_write": false, 00:12:08.240 "abort": true, 00:12:08.240 "seek_hole": false, 00:12:08.240 "seek_data": false, 00:12:08.240 "copy": true, 00:12:08.240 "nvme_iov_md": false 00:12:08.240 }, 00:12:08.240 "memory_domains": [ 00:12:08.240 { 00:12:08.240 "dma_device_id": "system", 00:12:08.240 "dma_device_type": 1 00:12:08.240 }, 00:12:08.240 { 00:12:08.240 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.240 "dma_device_type": 2 00:12:08.240 } 00:12:08.240 ], 00:12:08.240 "driver_specific": {} 00:12:08.240 }' 00:12:08.240 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.240 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.240 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:08.240 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:08.500 15:48:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:08.759 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:08.759 "name": "BaseBdev2", 00:12:08.759 "aliases": [ 00:12:08.759 "260f1d78-b91e-4da3-862e-f0d02940d881" 00:12:08.759 ], 00:12:08.759 "product_name": "Malloc disk", 00:12:08.759 "block_size": 512, 00:12:08.759 "num_blocks": 65536, 00:12:08.759 "uuid": "260f1d78-b91e-4da3-862e-f0d02940d881", 00:12:08.759 "assigned_rate_limits": { 00:12:08.759 "rw_ios_per_sec": 0, 00:12:08.759 "rw_mbytes_per_sec": 0, 00:12:08.759 "r_mbytes_per_sec": 0, 00:12:08.759 "w_mbytes_per_sec": 0 00:12:08.759 }, 00:12:08.759 "claimed": true, 00:12:08.759 "claim_type": "exclusive_write", 00:12:08.759 "zoned": false, 00:12:08.759 "supported_io_types": { 00:12:08.759 "read": true, 00:12:08.759 "write": true, 00:12:08.759 "unmap": true, 00:12:08.759 "flush": true, 00:12:08.759 "reset": true, 00:12:08.759 "nvme_admin": false, 00:12:08.759 "nvme_io": false, 00:12:08.759 "nvme_io_md": false, 00:12:08.759 "write_zeroes": true, 00:12:08.759 "zcopy": true, 00:12:08.759 "get_zone_info": false, 00:12:08.759 "zone_management": false, 
00:12:08.759 "zone_append": false, 00:12:08.759 "compare": false, 00:12:08.759 "compare_and_write": false, 00:12:08.759 "abort": true, 00:12:08.759 "seek_hole": false, 00:12:08.759 "seek_data": false, 00:12:08.759 "copy": true, 00:12:08.759 "nvme_iov_md": false 00:12:08.759 }, 00:12:08.759 "memory_domains": [ 00:12:08.760 { 00:12:08.760 "dma_device_id": "system", 00:12:08.760 "dma_device_type": 1 00:12:08.760 }, 00:12:08.760 { 00:12:08.760 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.760 "dma_device_type": 2 00:12:08.760 } 00:12:08.760 ], 00:12:08.760 "driver_specific": {} 00:12:08.760 }' 00:12:08.760 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.760 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:08.760 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:08.760 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.020 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:09.020 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:09.020 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.020 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:09.020 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:09.020 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.020 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:09.020 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:09.020 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:09.280 [2024-07-12 15:48:29.627648] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:09.281 [2024-07-12 15:48:29.627665] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:09.281 [2024-07-12 15:48:29.627700] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.281 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:09.541 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.541 "name": "Existed_Raid", 00:12:09.541 "uuid": "5e2e920c-44a6-4684-8a32-305596e4130b", 00:12:09.541 "strip_size_kb": 64, 00:12:09.541 "state": "offline", 00:12:09.541 "raid_level": "raid0", 00:12:09.541 "superblock": true, 00:12:09.541 "num_base_bdevs": 2, 00:12:09.541 "num_base_bdevs_discovered": 1, 00:12:09.541 "num_base_bdevs_operational": 1, 00:12:09.541 "base_bdevs_list": [ 00:12:09.541 { 00:12:09.541 "name": null, 00:12:09.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.541 "is_configured": false, 00:12:09.541 "data_offset": 2048, 00:12:09.541 "data_size": 63488 00:12:09.541 }, 00:12:09.541 { 00:12:09.541 "name": "BaseBdev2", 00:12:09.541 "uuid": "260f1d78-b91e-4da3-862e-f0d02940d881", 00:12:09.541 "is_configured": true, 00:12:09.541 "data_offset": 2048, 00:12:09.541 "data_size": 63488 00:12:09.541 } 00:12:09.541 ] 00:12:09.541 }' 00:12:09.541 15:48:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.541 15:48:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:10.109 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:10.109 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:10.109 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.109 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:10.370 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:10.370 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:10.370 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:10.370 [2024-07-12 15:48:30.754526] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:10.370 [2024-07-12 15:48:30.754563] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d65e80 name Existed_Raid, state offline 00:12:10.370 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:10.370 15:48:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:10.370 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.370 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:10.630 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:10.630 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:10.630 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:10.630 15:48:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2499862 00:12:10.630 15:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2499862 ']' 00:12:10.630 15:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2499862 00:12:10.630 15:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:10.630 15:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:10.630 15:48:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2499862 00:12:10.630 15:48:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:10.630 15:48:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:10.630 15:48:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2499862' 00:12:10.630 killing process with pid 2499862 00:12:10.630 15:48:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2499862 00:12:10.630 [2024-07-12 15:48:31.004860] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:10.630 15:48:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2499862 00:12:10.630 [2024-07-12 15:48:31.005446] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:10.891 15:48:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:10.891 00:12:10.891 real 0m8.804s 00:12:10.891 user 0m16.005s 00:12:10.891 sys 0m1.340s 00:12:10.891 15:48:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:10.891 15:48:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:10.891 ************************************ 00:12:10.891 END TEST raid_state_function_test_sb 00:12:10.891 ************************************ 00:12:10.891 15:48:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:10.891 15:48:31 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:12:10.891 15:48:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:10.891 15:48:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:10.891 15:48:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:10.891 ************************************ 00:12:10.891 START TEST raid_superblock_test 00:12:10.891 ************************************ 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:12:10.891 15:48:31 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2501597 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2501597 /var/tmp/spdk-raid.sock 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2501597 ']' 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:10.891 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:10.891 15:48:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.891 [2024-07-12 15:48:31.260586] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:12:10.891 [2024-07-12 15:48:31.260632] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2501597 ] 00:12:11.152 [2024-07-12 15:48:31.348567] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:11.152 [2024-07-12 15:48:31.414076] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:11.152 [2024-07-12 15:48:31.455822] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:11.152 [2024-07-12 15:48:31.455847] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:11.723 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:11.984 malloc1 00:12:11.984 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:12.244 [2024-07-12 15:48:32.457791] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:12.244 [2024-07-12 15:48:32.457828] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:12.244 [2024-07-12 15:48:32.457842] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11cfb50 00:12:12.244 [2024-07-12 15:48:32.457849] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:12.244 [2024-07-12 15:48:32.459109] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:12.244 [2024-07-12 15:48:32.459128] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:12.244 pt1 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:12.244 malloc2 00:12:12.244 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:12.504 [2024-07-12 15:48:32.828412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:12.504 [2024-07-12 15:48:32.828437] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:12.504 [2024-07-12 15:48:32.828445] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11d0df0 00:12:12.504 [2024-07-12 15:48:32.828452] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:12.504 [2024-07-12 15:48:32.829585] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:12.504 [2024-07-12 15:48:32.829604] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:12.504 pt2 00:12:12.504 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:12.504 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:12.504 15:48:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:12.764 [2024-07-12 15:48:33.020911] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:12.764 [2024-07-12 15:48:33.021868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:12.764 [2024-07-12 15:48:33.021972] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13760f0 00:12:12.764 [2024-07-12 15:48:33.021979] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:12.764 [2024-07-12 15:48:33.022116] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e6a40 00:12:12.765 [2024-07-12 15:48:33.022218] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13760f0 00:12:12.765 [2024-07-12 15:48:33.022224] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13760f0 00:12:12.765 [2024-07-12 15:48:33.022293] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:12.765 15:48:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.765 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:13.025 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:13.025 "name": "raid_bdev1", 00:12:13.025 "uuid": "11eeb434-3d20-46ef-b1cc-7467d7e7ba64", 00:12:13.025 "strip_size_kb": 64, 00:12:13.025 "state": "online", 00:12:13.025 "raid_level": "raid0", 00:12:13.025 "superblock": true, 00:12:13.025 "num_base_bdevs": 2, 00:12:13.025 "num_base_bdevs_discovered": 2, 00:12:13.025 "num_base_bdevs_operational": 2, 00:12:13.025 "base_bdevs_list": [ 00:12:13.025 { 00:12:13.025 "name": "pt1", 00:12:13.025 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:13.025 "is_configured": true, 00:12:13.025 "data_offset": 2048, 00:12:13.025 "data_size": 63488 00:12:13.025 }, 00:12:13.025 { 00:12:13.025 "name": "pt2", 00:12:13.025 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:13.025 "is_configured": true, 00:12:13.025 "data_offset": 2048, 00:12:13.025 "data_size": 63488 00:12:13.025 } 00:12:13.025 ] 00:12:13.025 }' 00:12:13.025 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:13.025 15:48:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:13.595 [2024-07-12 15:48:33.975502] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:13.595 "name": "raid_bdev1", 00:12:13.595 "aliases": [ 00:12:13.595 "11eeb434-3d20-46ef-b1cc-7467d7e7ba64" 00:12:13.595 ], 00:12:13.595 "product_name": "Raid Volume", 00:12:13.595 "block_size": 512, 00:12:13.595 "num_blocks": 126976, 00:12:13.595 "uuid": 
"11eeb434-3d20-46ef-b1cc-7467d7e7ba64", 00:12:13.595 "assigned_rate_limits": { 00:12:13.595 "rw_ios_per_sec": 0, 00:12:13.595 "rw_mbytes_per_sec": 0, 00:12:13.595 "r_mbytes_per_sec": 0, 00:12:13.595 "w_mbytes_per_sec": 0 00:12:13.595 }, 00:12:13.595 "claimed": false, 00:12:13.595 "zoned": false, 00:12:13.595 "supported_io_types": { 00:12:13.595 "read": true, 00:12:13.595 "write": true, 00:12:13.595 "unmap": true, 00:12:13.595 "flush": true, 00:12:13.595 "reset": true, 00:12:13.595 "nvme_admin": false, 00:12:13.595 "nvme_io": false, 00:12:13.595 "nvme_io_md": false, 00:12:13.595 "write_zeroes": true, 00:12:13.595 "zcopy": false, 00:12:13.595 "get_zone_info": false, 00:12:13.595 "zone_management": false, 00:12:13.595 "zone_append": false, 00:12:13.595 "compare": false, 00:12:13.595 "compare_and_write": false, 00:12:13.595 "abort": false, 00:12:13.595 "seek_hole": false, 00:12:13.595 "seek_data": false, 00:12:13.595 "copy": false, 00:12:13.595 "nvme_iov_md": false 00:12:13.595 }, 00:12:13.595 "memory_domains": [ 00:12:13.595 { 00:12:13.595 "dma_device_id": "system", 00:12:13.595 "dma_device_type": 1 00:12:13.595 }, 00:12:13.595 { 00:12:13.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:13.595 "dma_device_type": 2 00:12:13.595 }, 00:12:13.595 { 00:12:13.595 "dma_device_id": "system", 00:12:13.595 "dma_device_type": 1 00:12:13.595 }, 00:12:13.595 { 00:12:13.595 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:13.595 "dma_device_type": 2 00:12:13.595 } 00:12:13.595 ], 00:12:13.595 "driver_specific": { 00:12:13.595 "raid": { 00:12:13.595 "uuid": "11eeb434-3d20-46ef-b1cc-7467d7e7ba64", 00:12:13.595 "strip_size_kb": 64, 00:12:13.595 "state": "online", 00:12:13.595 "raid_level": "raid0", 00:12:13.595 "superblock": true, 00:12:13.595 "num_base_bdevs": 2, 00:12:13.595 "num_base_bdevs_discovered": 2, 00:12:13.595 "num_base_bdevs_operational": 2, 00:12:13.595 "base_bdevs_list": [ 00:12:13.595 { 00:12:13.595 "name": "pt1", 00:12:13.595 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:13.595 "is_configured": true, 00:12:13.595 "data_offset": 2048, 00:12:13.595 "data_size": 63488 00:12:13.595 }, 00:12:13.595 { 00:12:13.595 "name": "pt2", 00:12:13.595 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:13.595 "is_configured": true, 00:12:13.595 "data_offset": 2048, 00:12:13.595 "data_size": 63488 00:12:13.595 } 00:12:13.595 ] 00:12:13.595 } 00:12:13.595 } 00:12:13.595 }' 00:12:13.595 15:48:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:13.856 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:13.856 pt2' 00:12:13.856 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:13.856 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:13.856 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:13.856 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:13.856 "name": "pt1", 00:12:13.856 "aliases": [ 00:12:13.856 "00000000-0000-0000-0000-000000000001" 00:12:13.856 ], 00:12:13.856 "product_name": "passthru", 00:12:13.856 "block_size": 512, 00:12:13.856 "num_blocks": 65536, 00:12:13.856 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:13.856 "assigned_rate_limits": { 00:12:13.856 
"rw_ios_per_sec": 0, 00:12:13.856 "rw_mbytes_per_sec": 0, 00:12:13.856 "r_mbytes_per_sec": 0, 00:12:13.856 "w_mbytes_per_sec": 0 00:12:13.856 }, 00:12:13.856 "claimed": true, 00:12:13.856 "claim_type": "exclusive_write", 00:12:13.856 "zoned": false, 00:12:13.856 "supported_io_types": { 00:12:13.856 "read": true, 00:12:13.856 "write": true, 00:12:13.856 "unmap": true, 00:12:13.856 "flush": true, 00:12:13.856 "reset": true, 00:12:13.856 "nvme_admin": false, 00:12:13.856 "nvme_io": false, 00:12:13.856 "nvme_io_md": false, 00:12:13.856 "write_zeroes": true, 00:12:13.856 "zcopy": true, 00:12:13.856 "get_zone_info": false, 00:12:13.856 "zone_management": false, 00:12:13.856 "zone_append": false, 00:12:13.856 "compare": false, 00:12:13.856 "compare_and_write": false, 00:12:13.856 "abort": true, 00:12:13.856 "seek_hole": false, 00:12:13.856 "seek_data": false, 00:12:13.856 "copy": true, 00:12:13.856 "nvme_iov_md": false 00:12:13.856 }, 00:12:13.856 "memory_domains": [ 00:12:13.856 { 00:12:13.856 "dma_device_id": "system", 00:12:13.856 "dma_device_type": 1 00:12:13.856 }, 00:12:13.856 { 00:12:13.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:13.856 "dma_device_type": 2 00:12:13.856 } 00:12:13.856 ], 00:12:13.856 "driver_specific": { 00:12:13.856 "passthru": { 00:12:13.856 "name": "pt1", 00:12:13.856 "base_bdev_name": "malloc1" 00:12:13.856 } 00:12:13.856 } 00:12:13.856 }' 00:12:13.856 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:14.116 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:14.116 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:14.116 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:14.116 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:14.116 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:14.116 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:14.116 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:14.116 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:14.116 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:14.377 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:14.377 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:14.377 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:14.377 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:14.377 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:14.639 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:14.639 "name": "pt2", 00:12:14.639 "aliases": [ 00:12:14.639 "00000000-0000-0000-0000-000000000002" 00:12:14.639 ], 00:12:14.639 "product_name": "passthru", 00:12:14.639 "block_size": 512, 00:12:14.639 "num_blocks": 65536, 00:12:14.639 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:14.639 "assigned_rate_limits": { 00:12:14.639 "rw_ios_per_sec": 0, 00:12:14.639 "rw_mbytes_per_sec": 0, 00:12:14.639 "r_mbytes_per_sec": 0, 00:12:14.639 "w_mbytes_per_sec": 0 
00:12:14.639 }, 00:12:14.639 "claimed": true, 00:12:14.639 "claim_type": "exclusive_write", 00:12:14.639 "zoned": false, 00:12:14.639 "supported_io_types": { 00:12:14.639 "read": true, 00:12:14.639 "write": true, 00:12:14.639 "unmap": true, 00:12:14.639 "flush": true, 00:12:14.639 "reset": true, 00:12:14.639 "nvme_admin": false, 00:12:14.639 "nvme_io": false, 00:12:14.639 "nvme_io_md": false, 00:12:14.639 "write_zeroes": true, 00:12:14.639 "zcopy": true, 00:12:14.639 "get_zone_info": false, 00:12:14.639 "zone_management": false, 00:12:14.639 "zone_append": false, 00:12:14.639 "compare": false, 00:12:14.639 "compare_and_write": false, 00:12:14.639 "abort": true, 00:12:14.639 "seek_hole": false, 00:12:14.639 "seek_data": false, 00:12:14.639 "copy": true, 00:12:14.639 "nvme_iov_md": false 00:12:14.639 }, 00:12:14.639 "memory_domains": [ 00:12:14.639 { 00:12:14.639 "dma_device_id": "system", 00:12:14.639 "dma_device_type": 1 00:12:14.639 }, 00:12:14.639 { 00:12:14.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:14.639 "dma_device_type": 2 00:12:14.639 } 00:12:14.639 ], 00:12:14.639 "driver_specific": { 00:12:14.639 "passthru": { 00:12:14.639 "name": "pt2", 00:12:14.639 "base_bdev_name": "malloc2" 00:12:14.639 } 00:12:14.639 } 00:12:14.639 }' 00:12:14.639 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:14.639 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:14.639 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:14.639 15:48:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:14.639 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:14.639 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:14.639 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:14.899 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:14.899 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:14.899 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:14.899 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:14.899 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:14.899 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:14.899 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:15.472 [2024-07-12 15:48:35.772042] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:15.472 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=11eeb434-3d20-46ef-b1cc-7467d7e7ba64 00:12:15.472 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 11eeb434-3d20-46ef-b1cc-7467d7e7ba64 ']' 00:12:15.472 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:15.731 [2024-07-12 15:48:35.980360] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:15.731 [2024-07-12 15:48:35.980371] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:12:15.731 [2024-07-12 15:48:35.980410] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:15.731 [2024-07-12 15:48:35.980441] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:15.731 [2024-07-12 15:48:35.980446] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13760f0 name raid_bdev1, state offline 00:12:15.731 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:15.731 15:48:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.731 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:15.732 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:15.732 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:15.732 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:15.992 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:15.992 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:16.251 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:16.251 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:16.510 15:48:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:17.081 [2024-07-12 15:48:37.259556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:17.081 [2024-07-12 15:48:37.260615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:17.081 [2024-07-12 15:48:37.260657] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:17.081 [2024-07-12 15:48:37.260684] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:17.081 [2024-07-12 15:48:37.260694] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:17.081 [2024-07-12 15:48:37.260700] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1374dd0 name raid_bdev1, state configuring 00:12:17.081 request: 00:12:17.081 { 00:12:17.081 "name": "raid_bdev1", 00:12:17.081 "raid_level": "raid0", 00:12:17.081 "base_bdevs": [ 00:12:17.081 "malloc1", 00:12:17.081 "malloc2" 00:12:17.081 ], 00:12:17.081 "strip_size_kb": 64, 00:12:17.081 "superblock": false, 00:12:17.081 "method": "bdev_raid_create", 00:12:17.081 "req_id": 1 00:12:17.081 } 00:12:17.081 Got JSON-RPC error response 00:12:17.081 response: 00:12:17.081 { 00:12:17.081 "code": -17, 00:12:17.081 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:17.081 } 00:12:17.081 15:48:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:17.081 15:48:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:17.081 15:48:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:17.081 15:48:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:17.081 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.081 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:17.081 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:17.081 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:17.081 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:17.373 [2024-07-12 15:48:37.644488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:17.373 [2024-07-12 15:48:37.644519] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:17.373 [2024-07-12 15:48:37.644530] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1375920 00:12:17.373 [2024-07-12 15:48:37.644536] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:17.373 [2024-07-12 15:48:37.645808] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:17.373 [2024-07-12 15:48:37.645827] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:17.373 [2024-07-12 15:48:37.645874] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:17.373 [2024-07-12 15:48:37.645893] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:17.373 pt1 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.373 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:17.655 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.655 "name": "raid_bdev1", 00:12:17.655 "uuid": "11eeb434-3d20-46ef-b1cc-7467d7e7ba64", 00:12:17.655 "strip_size_kb": 64, 00:12:17.655 "state": "configuring", 00:12:17.655 "raid_level": "raid0", 00:12:17.655 "superblock": true, 00:12:17.655 "num_base_bdevs": 2, 00:12:17.655 "num_base_bdevs_discovered": 1, 00:12:17.655 "num_base_bdevs_operational": 2, 00:12:17.655 "base_bdevs_list": [ 00:12:17.655 { 00:12:17.655 "name": "pt1", 00:12:17.655 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:17.655 "is_configured": true, 00:12:17.655 "data_offset": 2048, 00:12:17.655 "data_size": 63488 00:12:17.655 }, 00:12:17.655 { 00:12:17.655 "name": null, 00:12:17.655 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:17.655 "is_configured": false, 00:12:17.655 "data_offset": 2048, 00:12:17.655 "data_size": 63488 00:12:17.655 } 00:12:17.655 ] 00:12:17.655 }' 00:12:17.655 15:48:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.655 15:48:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:18.225 [2024-07-12 15:48:38.574846] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:18.225 [2024-07-12 15:48:38.574875] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:18.225 [2024-07-12 15:48:38.574885] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11cfd80 00:12:18.225 [2024-07-12 15:48:38.574891] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:18.225 [2024-07-12 15:48:38.575154] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:18.225 [2024-07-12 15:48:38.575165] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:18.225 [2024-07-12 15:48:38.575205] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:18.225 [2024-07-12 15:48:38.575217] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:18.225 [2024-07-12 15:48:38.575291] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11ce990 00:12:18.225 [2024-07-12 15:48:38.575297] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:18.225 [2024-07-12 15:48:38.575432] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13751b0 00:12:18.225 [2024-07-12 15:48:38.575526] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11ce990 00:12:18.225 [2024-07-12 15:48:38.575531] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11ce990 00:12:18.225 [2024-07-12 15:48:38.575603] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:18.225 pt2 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:18.225 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.226 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:18.485 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:12:18.485 "name": "raid_bdev1", 00:12:18.485 "uuid": "11eeb434-3d20-46ef-b1cc-7467d7e7ba64", 00:12:18.485 "strip_size_kb": 64, 00:12:18.485 "state": "online", 00:12:18.485 "raid_level": "raid0", 00:12:18.485 "superblock": true, 00:12:18.485 "num_base_bdevs": 2, 00:12:18.485 "num_base_bdevs_discovered": 2, 00:12:18.485 "num_base_bdevs_operational": 2, 00:12:18.485 "base_bdevs_list": [ 00:12:18.485 { 00:12:18.485 "name": "pt1", 00:12:18.485 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:18.485 "is_configured": true, 00:12:18.485 "data_offset": 2048, 00:12:18.485 "data_size": 63488 00:12:18.485 }, 00:12:18.485 { 00:12:18.485 "name": "pt2", 00:12:18.485 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:18.485 "is_configured": true, 00:12:18.485 "data_offset": 2048, 00:12:18.485 "data_size": 63488 00:12:18.485 } 00:12:18.485 ] 00:12:18.485 }' 00:12:18.485 15:48:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.485 15:48:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.054 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:19.054 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:19.054 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:19.054 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:19.054 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:19.054 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:19.054 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:19.054 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:19.313 [2024-07-12 15:48:39.629741] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:19.313 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:19.313 "name": "raid_bdev1", 00:12:19.313 "aliases": [ 00:12:19.313 "11eeb434-3d20-46ef-b1cc-7467d7e7ba64" 00:12:19.313 ], 00:12:19.313 "product_name": "Raid Volume", 00:12:19.313 "block_size": 512, 00:12:19.313 "num_blocks": 126976, 00:12:19.313 "uuid": "11eeb434-3d20-46ef-b1cc-7467d7e7ba64", 00:12:19.313 "assigned_rate_limits": { 00:12:19.313 "rw_ios_per_sec": 0, 00:12:19.313 "rw_mbytes_per_sec": 0, 00:12:19.313 "r_mbytes_per_sec": 0, 00:12:19.313 "w_mbytes_per_sec": 0 00:12:19.313 }, 00:12:19.313 "claimed": false, 00:12:19.313 "zoned": false, 00:12:19.313 "supported_io_types": { 00:12:19.313 "read": true, 00:12:19.313 "write": true, 00:12:19.313 "unmap": true, 00:12:19.313 "flush": true, 00:12:19.313 "reset": true, 00:12:19.313 "nvme_admin": false, 00:12:19.313 "nvme_io": false, 00:12:19.313 "nvme_io_md": false, 00:12:19.313 "write_zeroes": true, 00:12:19.313 "zcopy": false, 00:12:19.313 "get_zone_info": false, 00:12:19.313 "zone_management": false, 00:12:19.313 "zone_append": false, 00:12:19.313 "compare": false, 00:12:19.313 "compare_and_write": false, 00:12:19.313 "abort": false, 00:12:19.313 "seek_hole": false, 00:12:19.313 "seek_data": false, 00:12:19.313 "copy": false, 00:12:19.313 "nvme_iov_md": false 00:12:19.313 }, 00:12:19.313 "memory_domains": [ 00:12:19.313 { 00:12:19.313 "dma_device_id": 
"system", 00:12:19.313 "dma_device_type": 1 00:12:19.313 }, 00:12:19.313 { 00:12:19.313 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.313 "dma_device_type": 2 00:12:19.313 }, 00:12:19.313 { 00:12:19.313 "dma_device_id": "system", 00:12:19.313 "dma_device_type": 1 00:12:19.313 }, 00:12:19.313 { 00:12:19.313 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.313 "dma_device_type": 2 00:12:19.313 } 00:12:19.313 ], 00:12:19.313 "driver_specific": { 00:12:19.313 "raid": { 00:12:19.313 "uuid": "11eeb434-3d20-46ef-b1cc-7467d7e7ba64", 00:12:19.313 "strip_size_kb": 64, 00:12:19.313 "state": "online", 00:12:19.313 "raid_level": "raid0", 00:12:19.313 "superblock": true, 00:12:19.313 "num_base_bdevs": 2, 00:12:19.313 "num_base_bdevs_discovered": 2, 00:12:19.313 "num_base_bdevs_operational": 2, 00:12:19.313 "base_bdevs_list": [ 00:12:19.313 { 00:12:19.313 "name": "pt1", 00:12:19.313 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:19.313 "is_configured": true, 00:12:19.313 "data_offset": 2048, 00:12:19.313 "data_size": 63488 00:12:19.313 }, 00:12:19.313 { 00:12:19.313 "name": "pt2", 00:12:19.313 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:19.313 "is_configured": true, 00:12:19.313 "data_offset": 2048, 00:12:19.313 "data_size": 63488 00:12:19.313 } 00:12:19.313 ] 00:12:19.313 } 00:12:19.313 } 00:12:19.313 }' 00:12:19.313 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:19.313 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:19.313 pt2' 00:12:19.313 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:19.313 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:19.313 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:19.574 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:19.574 "name": "pt1", 00:12:19.574 "aliases": [ 00:12:19.574 "00000000-0000-0000-0000-000000000001" 00:12:19.574 ], 00:12:19.574 "product_name": "passthru", 00:12:19.574 "block_size": 512, 00:12:19.574 "num_blocks": 65536, 00:12:19.574 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:19.574 "assigned_rate_limits": { 00:12:19.574 "rw_ios_per_sec": 0, 00:12:19.574 "rw_mbytes_per_sec": 0, 00:12:19.574 "r_mbytes_per_sec": 0, 00:12:19.574 "w_mbytes_per_sec": 0 00:12:19.574 }, 00:12:19.574 "claimed": true, 00:12:19.574 "claim_type": "exclusive_write", 00:12:19.574 "zoned": false, 00:12:19.574 "supported_io_types": { 00:12:19.574 "read": true, 00:12:19.574 "write": true, 00:12:19.574 "unmap": true, 00:12:19.574 "flush": true, 00:12:19.574 "reset": true, 00:12:19.574 "nvme_admin": false, 00:12:19.574 "nvme_io": false, 00:12:19.574 "nvme_io_md": false, 00:12:19.574 "write_zeroes": true, 00:12:19.574 "zcopy": true, 00:12:19.574 "get_zone_info": false, 00:12:19.574 "zone_management": false, 00:12:19.574 "zone_append": false, 00:12:19.574 "compare": false, 00:12:19.574 "compare_and_write": false, 00:12:19.574 "abort": true, 00:12:19.574 "seek_hole": false, 00:12:19.574 "seek_data": false, 00:12:19.574 "copy": true, 00:12:19.574 "nvme_iov_md": false 00:12:19.574 }, 00:12:19.574 "memory_domains": [ 00:12:19.574 { 00:12:19.574 "dma_device_id": "system", 00:12:19.574 "dma_device_type": 1 00:12:19.574 }, 
00:12:19.574 { 00:12:19.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.574 "dma_device_type": 2 00:12:19.574 } 00:12:19.574 ], 00:12:19.574 "driver_specific": { 00:12:19.574 "passthru": { 00:12:19.574 "name": "pt1", 00:12:19.574 "base_bdev_name": "malloc1" 00:12:19.574 } 00:12:19.574 } 00:12:19.574 }' 00:12:19.574 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.574 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:19.574 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:19.574 15:48:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.833 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.833 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:19.834 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.834 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.834 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:19.834 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.834 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.834 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:19.834 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:19.834 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:19.834 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:20.094 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:20.094 "name": "pt2", 00:12:20.094 "aliases": [ 00:12:20.094 "00000000-0000-0000-0000-000000000002" 00:12:20.094 ], 00:12:20.094 "product_name": "passthru", 00:12:20.094 "block_size": 512, 00:12:20.094 "num_blocks": 65536, 00:12:20.094 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:20.094 "assigned_rate_limits": { 00:12:20.094 "rw_ios_per_sec": 0, 00:12:20.094 "rw_mbytes_per_sec": 0, 00:12:20.094 "r_mbytes_per_sec": 0, 00:12:20.094 "w_mbytes_per_sec": 0 00:12:20.094 }, 00:12:20.094 "claimed": true, 00:12:20.094 "claim_type": "exclusive_write", 00:12:20.094 "zoned": false, 00:12:20.094 "supported_io_types": { 00:12:20.094 "read": true, 00:12:20.094 "write": true, 00:12:20.094 "unmap": true, 00:12:20.094 "flush": true, 00:12:20.094 "reset": true, 00:12:20.094 "nvme_admin": false, 00:12:20.094 "nvme_io": false, 00:12:20.094 "nvme_io_md": false, 00:12:20.094 "write_zeroes": true, 00:12:20.094 "zcopy": true, 00:12:20.094 "get_zone_info": false, 00:12:20.094 "zone_management": false, 00:12:20.094 "zone_append": false, 00:12:20.094 "compare": false, 00:12:20.094 "compare_and_write": false, 00:12:20.094 "abort": true, 00:12:20.094 "seek_hole": false, 00:12:20.094 "seek_data": false, 00:12:20.094 "copy": true, 00:12:20.094 "nvme_iov_md": false 00:12:20.094 }, 00:12:20.094 "memory_domains": [ 00:12:20.094 { 00:12:20.094 "dma_device_id": "system", 00:12:20.094 "dma_device_type": 1 00:12:20.094 }, 00:12:20.094 { 00:12:20.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.094 "dma_device_type": 2 00:12:20.094 } 00:12:20.094 ], 
00:12:20.094 "driver_specific": { 00:12:20.094 "passthru": { 00:12:20.094 "name": "pt2", 00:12:20.094 "base_bdev_name": "malloc2" 00:12:20.094 } 00:12:20.094 } 00:12:20.094 }' 00:12:20.094 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:20.094 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:20.353 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:20.614 [2024-07-12 15:48:40.925011] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 11eeb434-3d20-46ef-b1cc-7467d7e7ba64 '!=' 11eeb434-3d20-46ef-b1cc-7467d7e7ba64 ']' 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2501597 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2501597 ']' 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2501597 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2501597 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2501597' 00:12:20.614 killing process with pid 2501597 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2501597 00:12:20.614 [2024-07-12 15:48:40.997693] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:20.614 [2024-07-12 
15:48:40.997740] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:20.614 [2024-07-12 15:48:40.997772] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:20.614 [2024-07-12 15:48:40.997778] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11ce990 name raid_bdev1, state offline 00:12:20.614 15:48:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2501597 00:12:20.614 [2024-07-12 15:48:41.006854] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:20.875 15:48:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:20.875 00:12:20.875 real 0m9.936s 00:12:20.875 user 0m18.278s 00:12:20.875 sys 0m1.411s 00:12:20.875 15:48:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:20.875 15:48:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.875 ************************************ 00:12:20.875 END TEST raid_superblock_test 00:12:20.875 ************************************ 00:12:20.875 15:48:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:20.876 15:48:41 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:12:20.876 15:48:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:20.876 15:48:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:20.876 15:48:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:20.876 ************************************ 00:12:20.876 START TEST raid_read_error_test 00:12:20.876 ************************************ 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:20.876 15:48:41 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.5j6QAYM1PN 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2503546 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2503546 /var/tmp/spdk-raid.sock 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2503546 ']' 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:20.876 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:20.876 15:48:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.876 [2024-07-12 15:48:41.285615] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
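A note on the verification pattern that recurs throughout this log (in the raid_superblock_test above and in both error tests below): verify_raid_bdev_state from bdev_raid.sh queries the RPC server and filters the result with jq. The sketch below is condensed from the xtrace output, with the full /var/jenkins/... path shortened to rpc.py; it is illustrative only, and the expected values are taken from the calls visible in the trace.

    # Fetch all raid bdevs over the test socket and isolate the one under test.
    rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1")'
    # For a call such as 'verify_raid_bdev_state raid_bdev1 online raid0 64 2',
    # the helper is then expected to compare fields of that JSON against its
    # arguments, e.g. .state == "online", .raid_level == "raid0",
    # .strip_size_kb == 64, .num_base_bdevs_operational == 2.
    # (The comparison code itself does not appear in this trace.)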
00:12:20.876 [2024-07-12 15:48:41.285674] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2503546 ] 00:12:21.136 [2024-07-12 15:48:41.378103] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:21.136 [2024-07-12 15:48:41.454398] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:21.136 [2024-07-12 15:48:41.494812] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:21.136 [2024-07-12 15:48:41.494840] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:21.706 15:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:21.706 15:48:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:21.706 15:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:21.706 15:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:21.966 BaseBdev1_malloc 00:12:21.966 15:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:22.226 true 00:12:22.226 15:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:22.226 [2024-07-12 15:48:42.597688] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:22.226 [2024-07-12 15:48:42.597723] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:22.226 [2024-07-12 15:48:42.597734] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b4aa0 00:12:22.226 [2024-07-12 15:48:42.597741] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:22.226 [2024-07-12 15:48:42.598954] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:22.226 [2024-07-12 15:48:42.598972] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:22.226 BaseBdev1 00:12:22.226 15:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:22.226 15:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:22.485 BaseBdev2_malloc 00:12:22.485 15:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:22.745 true 00:12:22.745 15:48:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:22.745 [2024-07-12 15:48:43.176860] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:22.745 [2024-07-12 15:48:43.176891] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:22.745 [2024-07-12 15:48:43.176902] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b9e40 00:12:22.745 [2024-07-12 15:48:43.176908] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:22.745 [2024-07-12 15:48:43.178065] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:22.745 [2024-07-12 15:48:43.178083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:22.745 BaseBdev2 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:23.005 [2024-07-12 15:48:43.361339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:23.005 [2024-07-12 15:48:43.362310] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:23.005 [2024-07-12 15:48:43.362449] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14bb000 00:12:23.005 [2024-07-12 15:48:43.362458] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:23.005 [2024-07-12 15:48:43.362596] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1311ef0 00:12:23.005 [2024-07-12 15:48:43.362708] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14bb000 00:12:23.005 [2024-07-12 15:48:43.362720] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14bb000 00:12:23.005 [2024-07-12 15:48:43.362797] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.005 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:23.265 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:23.265 "name": "raid_bdev1", 00:12:23.265 "uuid": "91da2eda-5daa-4748-a506-a1aacb4c28c7", 00:12:23.265 "strip_size_kb": 64, 00:12:23.265 "state": "online", 00:12:23.265 "raid_level": "raid0", 
00:12:23.265 "superblock": true, 00:12:23.265 "num_base_bdevs": 2, 00:12:23.265 "num_base_bdevs_discovered": 2, 00:12:23.265 "num_base_bdevs_operational": 2, 00:12:23.265 "base_bdevs_list": [ 00:12:23.265 { 00:12:23.265 "name": "BaseBdev1", 00:12:23.265 "uuid": "f907a359-1a34-5d18-a2b5-f9c172b5f5b2", 00:12:23.265 "is_configured": true, 00:12:23.265 "data_offset": 2048, 00:12:23.265 "data_size": 63488 00:12:23.265 }, 00:12:23.265 { 00:12:23.265 "name": "BaseBdev2", 00:12:23.265 "uuid": "f5346497-8170-58cb-ab79-2bcc23a557ce", 00:12:23.265 "is_configured": true, 00:12:23.265 "data_offset": 2048, 00:12:23.265 "data_size": 63488 00:12:23.265 } 00:12:23.265 ] 00:12:23.265 }' 00:12:23.265 15:48:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:23.265 15:48:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.833 15:48:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:23.833 15:48:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:23.833 [2024-07-12 15:48:44.159579] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14b6670 00:12:24.773 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:25.032 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:25.033 "name": "raid_bdev1", 00:12:25.033 "uuid": "91da2eda-5daa-4748-a506-a1aacb4c28c7", 00:12:25.033 "strip_size_kb": 64, 00:12:25.033 "state": "online", 00:12:25.033 
"raid_level": "raid0", 00:12:25.033 "superblock": true, 00:12:25.033 "num_base_bdevs": 2, 00:12:25.033 "num_base_bdevs_discovered": 2, 00:12:25.033 "num_base_bdevs_operational": 2, 00:12:25.033 "base_bdevs_list": [ 00:12:25.033 { 00:12:25.033 "name": "BaseBdev1", 00:12:25.033 "uuid": "f907a359-1a34-5d18-a2b5-f9c172b5f5b2", 00:12:25.033 "is_configured": true, 00:12:25.033 "data_offset": 2048, 00:12:25.033 "data_size": 63488 00:12:25.033 }, 00:12:25.033 { 00:12:25.033 "name": "BaseBdev2", 00:12:25.033 "uuid": "f5346497-8170-58cb-ab79-2bcc23a557ce", 00:12:25.033 "is_configured": true, 00:12:25.033 "data_offset": 2048, 00:12:25.033 "data_size": 63488 00:12:25.033 } 00:12:25.033 ] 00:12:25.033 }' 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:25.033 15:48:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:25.602 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:25.866 [2024-07-12 15:48:46.184124] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:25.866 [2024-07-12 15:48:46.184155] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:25.866 [2024-07-12 15:48:46.186748] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:25.866 [2024-07-12 15:48:46.186768] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:25.866 [2024-07-12 15:48:46.186788] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:25.866 [2024-07-12 15:48:46.186794] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14bb000 name raid_bdev1, state offline 00:12:25.866 0 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2503546 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2503546 ']' 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2503546 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2503546 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2503546' 00:12:25.866 killing process with pid 2503546 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2503546 00:12:25.866 [2024-07-12 15:48:46.271410] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:25.866 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2503546 00:12:25.866 [2024-07-12 15:48:46.276926] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # 
grep -v Job /raidtest/tmp.5j6QAYM1PN 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:12:26.127 00:12:26.127 real 0m5.205s 00:12:26.127 user 0m8.143s 00:12:26.127 sys 0m0.739s 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:26.127 15:48:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.127 ************************************ 00:12:26.127 END TEST raid_read_error_test 00:12:26.127 ************************************ 00:12:26.127 15:48:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:26.127 15:48:46 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:12:26.127 15:48:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:26.127 15:48:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:26.127 15:48:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:26.127 ************************************ 00:12:26.127 START TEST raid_write_error_test 00:12:26.127 ************************************ 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 
-- # local bdevperf_log 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.5KZGOCdWPE 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2504475 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2504475 /var/tmp/spdk-raid.sock 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2504475 ']' 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:26.127 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:26.127 15:48:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:26.127 [2024-07-12 15:48:46.552009] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
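Both error tests assemble the same bdev stack before injecting faults, and the construction is easy to lose in the xtrace noise. The following is condensed from the RPC calls visible in the trace and is illustrative only; every real call goes through the full /var/jenkins/... rpc.py path and passes -s /var/tmp/spdk-raid.sock.

    # One leg of the raid0 volume: malloc base, error-injection wrapper, passthru on top.
    rpc.py bdev_malloc_create 32 512 -b BaseBdev1_malloc   # 32 MiB, 512 B blocks -> 65536 blocks
    rpc.py bdev_error_create BaseBdev1_malloc              # exposes EE_BaseBdev1_malloc
    rpc.py bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
    # The second leg repeats this with BaseBdev2*, then the raid0 bdev is created
    # with an on-disk superblock (-s):
    rpc.py bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
    # Once bdevperf is running, faults are injected on the error wrapper:
    # 'read failure' in the test above, 'write failure' in this one.
    rpc.py bdev_error_inject_error EE_BaseBdev1_malloc write failure

Because raid0 carries no redundancy, the injected errors must surface to the application: at the end of each test the bdevperf log is filtered with grep/awk to pull the raid_bdev1 failure rate (fail_per_s), and the test passes only if it is not 0.00; hence the 0.50 reported above for reads and the 0.49 reported below for writes.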
00:12:26.128 [2024-07-12 15:48:46.552064] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2504475 ] 00:12:26.388 [2024-07-12 15:48:46.644135] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:26.388 [2024-07-12 15:48:46.721172] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:26.388 [2024-07-12 15:48:46.773409] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:26.388 [2024-07-12 15:48:46.773433] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:26.959 15:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:26.959 15:48:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:26.959 15:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:26.959 15:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:27.219 BaseBdev1_malloc 00:12:27.219 15:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:27.480 true 00:12:27.480 15:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:27.741 [2024-07-12 15:48:47.932767] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:27.741 [2024-07-12 15:48:47.932799] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:27.741 [2024-07-12 15:48:47.932810] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbcfaa0 00:12:27.741 [2024-07-12 15:48:47.932817] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:27.741 [2024-07-12 15:48:47.934084] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:27.741 [2024-07-12 15:48:47.934104] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:27.741 BaseBdev1 00:12:27.741 15:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:27.741 15:48:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:27.741 BaseBdev2_malloc 00:12:27.741 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:28.002 true 00:12:28.002 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:28.262 [2024-07-12 15:48:48.487742] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:28.262 [2024-07-12 15:48:48.487770] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:28.262 [2024-07-12 15:48:48.487781] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbd4e40 00:12:28.262 [2024-07-12 15:48:48.487787] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:28.262 [2024-07-12 15:48:48.488932] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:28.262 [2024-07-12 15:48:48.488951] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:28.262 BaseBdev2 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:28.262 [2024-07-12 15:48:48.680250] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:28.262 [2024-07-12 15:48:48.681233] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:28.262 [2024-07-12 15:48:48.681370] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbd6000 00:12:28.262 [2024-07-12 15:48:48.681378] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:28.262 [2024-07-12 15:48:48.681517] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa2cef0 00:12:28.262 [2024-07-12 15:48:48.681627] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbd6000 00:12:28.262 [2024-07-12 15:48:48.681632] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbd6000 00:12:28.262 [2024-07-12 15:48:48.681707] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.262 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:28.522 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.522 "name": "raid_bdev1", 00:12:28.522 "uuid": "dd1a800f-f091-43ef-9c13-26ef8fb85352", 00:12:28.522 "strip_size_kb": 64, 00:12:28.522 "state": "online", 00:12:28.522 "raid_level": 
"raid0", 00:12:28.522 "superblock": true, 00:12:28.522 "num_base_bdevs": 2, 00:12:28.522 "num_base_bdevs_discovered": 2, 00:12:28.522 "num_base_bdevs_operational": 2, 00:12:28.522 "base_bdevs_list": [ 00:12:28.522 { 00:12:28.522 "name": "BaseBdev1", 00:12:28.522 "uuid": "eafbc6c7-b56d-549c-bf08-ca05a1fe0cfa", 00:12:28.522 "is_configured": true, 00:12:28.522 "data_offset": 2048, 00:12:28.522 "data_size": 63488 00:12:28.522 }, 00:12:28.522 { 00:12:28.522 "name": "BaseBdev2", 00:12:28.522 "uuid": "5697e54c-54e9-5a5f-af73-52e63c2ae2a7", 00:12:28.522 "is_configured": true, 00:12:28.522 "data_offset": 2048, 00:12:28.522 "data_size": 63488 00:12:28.522 } 00:12:28.522 ] 00:12:28.522 }' 00:12:28.522 15:48:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.522 15:48:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.092 15:48:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:29.092 15:48:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:29.092 [2024-07-12 15:48:49.530621] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbd1670 00:12:30.033 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.293 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:30.553 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.553 "name": "raid_bdev1", 00:12:30.553 "uuid": "dd1a800f-f091-43ef-9c13-26ef8fb85352", 00:12:30.553 "strip_size_kb": 64, 00:12:30.553 
"state": "online", 00:12:30.553 "raid_level": "raid0", 00:12:30.553 "superblock": true, 00:12:30.553 "num_base_bdevs": 2, 00:12:30.553 "num_base_bdevs_discovered": 2, 00:12:30.553 "num_base_bdevs_operational": 2, 00:12:30.553 "base_bdevs_list": [ 00:12:30.553 { 00:12:30.553 "name": "BaseBdev1", 00:12:30.553 "uuid": "eafbc6c7-b56d-549c-bf08-ca05a1fe0cfa", 00:12:30.553 "is_configured": true, 00:12:30.553 "data_offset": 2048, 00:12:30.553 "data_size": 63488 00:12:30.553 }, 00:12:30.553 { 00:12:30.553 "name": "BaseBdev2", 00:12:30.553 "uuid": "5697e54c-54e9-5a5f-af73-52e63c2ae2a7", 00:12:30.553 "is_configured": true, 00:12:30.553 "data_offset": 2048, 00:12:30.553 "data_size": 63488 00:12:30.553 } 00:12:30.553 ] 00:12:30.553 }' 00:12:30.553 15:48:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.553 15:48:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.124 15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:31.124 [2024-07-12 15:48:51.562991] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:31.124 [2024-07-12 15:48:51.563020] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:31.124 [2024-07-12 15:48:51.565626] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:31.124 [2024-07-12 15:48:51.565648] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:31.124 [2024-07-12 15:48:51.565668] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:31.124 [2024-07-12 15:48:51.565673] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbd6000 name raid_bdev1, state offline 00:12:31.124 0 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2504475 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2504475 ']' 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2504475 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2504475 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2504475' 00:12:31.386 killing process with pid 2504475 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2504475 00:12:31.386 [2024-07-12 15:48:51.647626] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2504475 00:12:31.386 [2024-07-12 15:48:51.653436] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.5KZGOCdWPE 00:12:31.386 
15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:12:31.386 00:12:31.386 real 0m5.307s 00:12:31.386 user 0m8.354s 00:12:31.386 sys 0m0.730s 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:31.386 15:48:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.386 ************************************ 00:12:31.386 END TEST raid_write_error_test 00:12:31.386 ************************************ 00:12:31.386 15:48:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:31.386 15:48:51 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:31.386 15:48:51 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:12:31.386 15:48:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:31.386 15:48:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:31.386 15:48:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:31.646 ************************************ 00:12:31.646 START TEST raid_state_function_test 00:12:31.646 ************************************ 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:31.646 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 
-- # local raid_bdev_name=Existed_Raid 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2505409 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2505409' 00:12:31.647 Process raid pid: 2505409 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2505409 /var/tmp/spdk-raid.sock 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2505409 ']' 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:31.647 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:31.647 15:48:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:31.647 [2024-07-12 15:48:51.936730] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:12:31.647 [2024-07-12 15:48:51.936797] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:31.647 [2024-07-12 15:48:52.028604] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:31.908 [2024-07-12 15:48:52.097067] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:31.908 [2024-07-12 15:48:52.148837] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:31.908 [2024-07-12 15:48:52.148858] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:32.478 15:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:32.478 15:48:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:32.478 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:32.478 [2024-07-12 15:48:52.920745] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:32.478 [2024-07-12 15:48:52.920772] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:32.478 [2024-07-12 15:48:52.920778] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:32.478 [2024-07-12 15:48:52.920783] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:32.737 15:48:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:32.737 15:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:32.737 "name": "Existed_Raid", 00:12:32.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.737 "strip_size_kb": 64, 00:12:32.737 "state": "configuring", 00:12:32.737 "raid_level": "concat", 00:12:32.737 "superblock": false, 
00:12:32.737 "num_base_bdevs": 2, 00:12:32.737 "num_base_bdevs_discovered": 0, 00:12:32.737 "num_base_bdevs_operational": 2, 00:12:32.737 "base_bdevs_list": [ 00:12:32.737 { 00:12:32.737 "name": "BaseBdev1", 00:12:32.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.737 "is_configured": false, 00:12:32.737 "data_offset": 0, 00:12:32.737 "data_size": 0 00:12:32.737 }, 00:12:32.737 { 00:12:32.737 "name": "BaseBdev2", 00:12:32.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:32.737 "is_configured": false, 00:12:32.737 "data_offset": 0, 00:12:32.737 "data_size": 0 00:12:32.737 } 00:12:32.737 ] 00:12:32.737 }' 00:12:32.737 15:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:32.737 15:48:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:33.306 15:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:33.565 [2024-07-12 15:48:53.850994] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:33.566 [2024-07-12 15:48:53.851011] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe56900 name Existed_Raid, state configuring 00:12:33.566 15:48:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:33.826 [2024-07-12 15:48:54.039487] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:33.826 [2024-07-12 15:48:54.039506] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:33.826 [2024-07-12 15:48:54.039511] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:33.826 [2024-07-12 15:48:54.039517] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:33.826 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:33.826 [2024-07-12 15:48:54.234589] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:33.826 BaseBdev1 00:12:33.826 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:33.826 15:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:33.826 15:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:33.826 15:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:33.826 15:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:33.826 15:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:33.826 15:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:34.086 15:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 
00:12:34.346 [ 00:12:34.346 { 00:12:34.346 "name": "BaseBdev1", 00:12:34.346 "aliases": [ 00:12:34.346 "9014551b-413d-4e44-87ee-1c9268195d7b" 00:12:34.346 ], 00:12:34.346 "product_name": "Malloc disk", 00:12:34.346 "block_size": 512, 00:12:34.346 "num_blocks": 65536, 00:12:34.346 "uuid": "9014551b-413d-4e44-87ee-1c9268195d7b", 00:12:34.346 "assigned_rate_limits": { 00:12:34.346 "rw_ios_per_sec": 0, 00:12:34.346 "rw_mbytes_per_sec": 0, 00:12:34.346 "r_mbytes_per_sec": 0, 00:12:34.346 "w_mbytes_per_sec": 0 00:12:34.346 }, 00:12:34.346 "claimed": true, 00:12:34.346 "claim_type": "exclusive_write", 00:12:34.346 "zoned": false, 00:12:34.346 "supported_io_types": { 00:12:34.346 "read": true, 00:12:34.346 "write": true, 00:12:34.346 "unmap": true, 00:12:34.346 "flush": true, 00:12:34.346 "reset": true, 00:12:34.346 "nvme_admin": false, 00:12:34.346 "nvme_io": false, 00:12:34.346 "nvme_io_md": false, 00:12:34.346 "write_zeroes": true, 00:12:34.346 "zcopy": true, 00:12:34.346 "get_zone_info": false, 00:12:34.346 "zone_management": false, 00:12:34.346 "zone_append": false, 00:12:34.346 "compare": false, 00:12:34.346 "compare_and_write": false, 00:12:34.346 "abort": true, 00:12:34.346 "seek_hole": false, 00:12:34.346 "seek_data": false, 00:12:34.346 "copy": true, 00:12:34.346 "nvme_iov_md": false 00:12:34.346 }, 00:12:34.346 "memory_domains": [ 00:12:34.346 { 00:12:34.346 "dma_device_id": "system", 00:12:34.346 "dma_device_type": 1 00:12:34.346 }, 00:12:34.346 { 00:12:34.346 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:34.346 "dma_device_type": 2 00:12:34.346 } 00:12:34.346 ], 00:12:34.346 "driver_specific": {} 00:12:34.346 } 00:12:34.346 ] 00:12:34.346 15:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:34.346 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:34.346 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:34.346 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:34.346 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:34.346 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:34.346 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:34.346 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:34.346 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:34.347 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:34.347 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.347 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.347 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.347 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.347 "name": "Existed_Raid", 00:12:34.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.347 "strip_size_kb": 64, 00:12:34.347 "state": "configuring", 00:12:34.347 
"raid_level": "concat", 00:12:34.347 "superblock": false, 00:12:34.347 "num_base_bdevs": 2, 00:12:34.347 "num_base_bdevs_discovered": 1, 00:12:34.347 "num_base_bdevs_operational": 2, 00:12:34.347 "base_bdevs_list": [ 00:12:34.347 { 00:12:34.347 "name": "BaseBdev1", 00:12:34.347 "uuid": "9014551b-413d-4e44-87ee-1c9268195d7b", 00:12:34.347 "is_configured": true, 00:12:34.347 "data_offset": 0, 00:12:34.347 "data_size": 65536 00:12:34.347 }, 00:12:34.347 { 00:12:34.347 "name": "BaseBdev2", 00:12:34.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.347 "is_configured": false, 00:12:34.347 "data_offset": 0, 00:12:34.347 "data_size": 0 00:12:34.347 } 00:12:34.347 ] 00:12:34.347 }' 00:12:34.347 15:48:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.347 15:48:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.916 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:35.177 [2024-07-12 15:48:55.493762] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:35.177 [2024-07-12 15:48:55.493786] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe561d0 name Existed_Raid, state configuring 00:12:35.177 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:35.437 [2024-07-12 15:48:55.682263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:35.437 [2024-07-12 15:48:55.683397] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:35.437 [2024-07-12 15:48:55.683420] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.437 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:35.699 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:35.699 "name": "Existed_Raid", 00:12:35.699 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.699 "strip_size_kb": 64, 00:12:35.699 "state": "configuring", 00:12:35.699 "raid_level": "concat", 00:12:35.699 "superblock": false, 00:12:35.699 "num_base_bdevs": 2, 00:12:35.699 "num_base_bdevs_discovered": 1, 00:12:35.699 "num_base_bdevs_operational": 2, 00:12:35.699 "base_bdevs_list": [ 00:12:35.699 { 00:12:35.699 "name": "BaseBdev1", 00:12:35.699 "uuid": "9014551b-413d-4e44-87ee-1c9268195d7b", 00:12:35.699 "is_configured": true, 00:12:35.699 "data_offset": 0, 00:12:35.699 "data_size": 65536 00:12:35.699 }, 00:12:35.699 { 00:12:35.699 "name": "BaseBdev2", 00:12:35.699 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:35.699 "is_configured": false, 00:12:35.699 "data_offset": 0, 00:12:35.699 "data_size": 0 00:12:35.699 } 00:12:35.699 ] 00:12:35.699 }' 00:12:35.699 15:48:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:35.699 15:48:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.269 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:36.269 [2024-07-12 15:48:56.605453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:36.269 [2024-07-12 15:48:56.605476] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe56e80 00:12:36.269 [2024-07-12 15:48:56.605480] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:36.269 [2024-07-12 15:48:56.605628] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb55290 00:12:36.269 [2024-07-12 15:48:56.605725] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe56e80 00:12:36.269 [2024-07-12 15:48:56.605731] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe56e80 00:12:36.269 [2024-07-12 15:48:56.605846] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:36.269 BaseBdev2 00:12:36.269 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:36.269 15:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:36.269 15:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:36.269 15:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:36.269 15:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:36.269 15:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:36.269 15:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:36.566 15:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:36.896 [ 00:12:36.896 { 00:12:36.896 "name": "BaseBdev2", 00:12:36.896 "aliases": [ 00:12:36.896 "5a88424a-70f4-4af9-8d4b-a147fbad475b" 00:12:36.896 ], 00:12:36.896 "product_name": "Malloc disk", 00:12:36.896 "block_size": 512, 00:12:36.896 "num_blocks": 65536, 00:12:36.896 "uuid": "5a88424a-70f4-4af9-8d4b-a147fbad475b", 00:12:36.896 "assigned_rate_limits": { 00:12:36.896 "rw_ios_per_sec": 0, 00:12:36.896 "rw_mbytes_per_sec": 0, 00:12:36.896 "r_mbytes_per_sec": 0, 00:12:36.896 "w_mbytes_per_sec": 0 00:12:36.896 }, 00:12:36.896 "claimed": true, 00:12:36.896 "claim_type": "exclusive_write", 00:12:36.896 "zoned": false, 00:12:36.896 "supported_io_types": { 00:12:36.896 "read": true, 00:12:36.896 "write": true, 00:12:36.896 "unmap": true, 00:12:36.896 "flush": true, 00:12:36.896 "reset": true, 00:12:36.896 "nvme_admin": false, 00:12:36.896 "nvme_io": false, 00:12:36.896 "nvme_io_md": false, 00:12:36.896 "write_zeroes": true, 00:12:36.896 "zcopy": true, 00:12:36.896 "get_zone_info": false, 00:12:36.896 "zone_management": false, 00:12:36.896 "zone_append": false, 00:12:36.896 "compare": false, 00:12:36.896 "compare_and_write": false, 00:12:36.896 "abort": true, 00:12:36.896 "seek_hole": false, 00:12:36.896 "seek_data": false, 00:12:36.896 "copy": true, 00:12:36.896 "nvme_iov_md": false 00:12:36.896 }, 00:12:36.896 "memory_domains": [ 00:12:36.896 { 00:12:36.896 "dma_device_id": "system", 00:12:36.896 "dma_device_type": 1 00:12:36.896 }, 00:12:36.896 { 00:12:36.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:36.896 "dma_device_type": 2 00:12:36.896 } 00:12:36.896 ], 00:12:36.896 "driver_specific": {} 00:12:36.896 } 00:12:36.896 ] 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.896 15:48:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.896 15:48:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.896 "name": "Existed_Raid", 00:12:36.896 "uuid": "cba6fb6c-0367-4b8d-b453-43e6bfa84acb", 00:12:36.896 "strip_size_kb": 64, 00:12:36.896 "state": "online", 00:12:36.896 "raid_level": "concat", 00:12:36.896 "superblock": false, 00:12:36.896 "num_base_bdevs": 2, 00:12:36.896 "num_base_bdevs_discovered": 2, 00:12:36.896 "num_base_bdevs_operational": 2, 00:12:36.896 "base_bdevs_list": [ 00:12:36.896 { 00:12:36.896 "name": "BaseBdev1", 00:12:36.896 "uuid": "9014551b-413d-4e44-87ee-1c9268195d7b", 00:12:36.896 "is_configured": true, 00:12:36.896 "data_offset": 0, 00:12:36.896 "data_size": 65536 00:12:36.896 }, 00:12:36.896 { 00:12:36.896 "name": "BaseBdev2", 00:12:36.896 "uuid": "5a88424a-70f4-4af9-8d4b-a147fbad475b", 00:12:36.896 "is_configured": true, 00:12:36.896 "data_offset": 0, 00:12:36.896 "data_size": 65536 00:12:36.896 } 00:12:36.896 ] 00:12:36.896 }' 00:12:36.896 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.896 15:48:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:37.468 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:37.468 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:37.468 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:37.468 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:37.468 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:37.468 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:37.468 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:37.468 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:37.468 [2024-07-12 15:48:57.908962] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:37.729 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:37.729 "name": "Existed_Raid", 00:12:37.729 "aliases": [ 00:12:37.729 "cba6fb6c-0367-4b8d-b453-43e6bfa84acb" 00:12:37.729 ], 00:12:37.729 "product_name": "Raid Volume", 00:12:37.729 "block_size": 512, 00:12:37.729 "num_blocks": 131072, 00:12:37.729 "uuid": "cba6fb6c-0367-4b8d-b453-43e6bfa84acb", 00:12:37.729 "assigned_rate_limits": { 00:12:37.729 "rw_ios_per_sec": 0, 00:12:37.729 "rw_mbytes_per_sec": 0, 00:12:37.729 "r_mbytes_per_sec": 0, 00:12:37.729 "w_mbytes_per_sec": 0 00:12:37.729 }, 00:12:37.729 "claimed": false, 00:12:37.729 "zoned": false, 00:12:37.729 "supported_io_types": { 00:12:37.729 "read": true, 00:12:37.729 "write": true, 00:12:37.729 "unmap": true, 00:12:37.729 "flush": true, 00:12:37.729 "reset": true, 00:12:37.729 "nvme_admin": false, 00:12:37.729 "nvme_io": false, 00:12:37.729 "nvme_io_md": false, 00:12:37.729 "write_zeroes": true, 00:12:37.729 "zcopy": false, 00:12:37.729 "get_zone_info": false, 00:12:37.729 "zone_management": false, 00:12:37.729 "zone_append": false, 00:12:37.729 "compare": false, 00:12:37.729 "compare_and_write": false, 00:12:37.729 "abort": false, 00:12:37.729 "seek_hole": false, 00:12:37.729 "seek_data": false, 00:12:37.729 "copy": false, 00:12:37.729 
"nvme_iov_md": false 00:12:37.729 }, 00:12:37.729 "memory_domains": [ 00:12:37.729 { 00:12:37.729 "dma_device_id": "system", 00:12:37.729 "dma_device_type": 1 00:12:37.729 }, 00:12:37.729 { 00:12:37.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.729 "dma_device_type": 2 00:12:37.729 }, 00:12:37.729 { 00:12:37.730 "dma_device_id": "system", 00:12:37.730 "dma_device_type": 1 00:12:37.730 }, 00:12:37.730 { 00:12:37.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.730 "dma_device_type": 2 00:12:37.730 } 00:12:37.730 ], 00:12:37.730 "driver_specific": { 00:12:37.730 "raid": { 00:12:37.730 "uuid": "cba6fb6c-0367-4b8d-b453-43e6bfa84acb", 00:12:37.730 "strip_size_kb": 64, 00:12:37.730 "state": "online", 00:12:37.730 "raid_level": "concat", 00:12:37.730 "superblock": false, 00:12:37.730 "num_base_bdevs": 2, 00:12:37.730 "num_base_bdevs_discovered": 2, 00:12:37.730 "num_base_bdevs_operational": 2, 00:12:37.730 "base_bdevs_list": [ 00:12:37.730 { 00:12:37.730 "name": "BaseBdev1", 00:12:37.730 "uuid": "9014551b-413d-4e44-87ee-1c9268195d7b", 00:12:37.730 "is_configured": true, 00:12:37.730 "data_offset": 0, 00:12:37.730 "data_size": 65536 00:12:37.730 }, 00:12:37.730 { 00:12:37.730 "name": "BaseBdev2", 00:12:37.730 "uuid": "5a88424a-70f4-4af9-8d4b-a147fbad475b", 00:12:37.730 "is_configured": true, 00:12:37.730 "data_offset": 0, 00:12:37.730 "data_size": 65536 00:12:37.730 } 00:12:37.730 ] 00:12:37.730 } 00:12:37.730 } 00:12:37.730 }' 00:12:37.730 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:37.730 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:37.730 BaseBdev2' 00:12:37.730 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:37.730 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:37.730 15:48:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:37.730 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:37.730 "name": "BaseBdev1", 00:12:37.730 "aliases": [ 00:12:37.730 "9014551b-413d-4e44-87ee-1c9268195d7b" 00:12:37.730 ], 00:12:37.730 "product_name": "Malloc disk", 00:12:37.730 "block_size": 512, 00:12:37.730 "num_blocks": 65536, 00:12:37.730 "uuid": "9014551b-413d-4e44-87ee-1c9268195d7b", 00:12:37.730 "assigned_rate_limits": { 00:12:37.730 "rw_ios_per_sec": 0, 00:12:37.730 "rw_mbytes_per_sec": 0, 00:12:37.730 "r_mbytes_per_sec": 0, 00:12:37.730 "w_mbytes_per_sec": 0 00:12:37.730 }, 00:12:37.730 "claimed": true, 00:12:37.730 "claim_type": "exclusive_write", 00:12:37.730 "zoned": false, 00:12:37.730 "supported_io_types": { 00:12:37.730 "read": true, 00:12:37.730 "write": true, 00:12:37.730 "unmap": true, 00:12:37.730 "flush": true, 00:12:37.730 "reset": true, 00:12:37.730 "nvme_admin": false, 00:12:37.730 "nvme_io": false, 00:12:37.730 "nvme_io_md": false, 00:12:37.730 "write_zeroes": true, 00:12:37.730 "zcopy": true, 00:12:37.730 "get_zone_info": false, 00:12:37.730 "zone_management": false, 00:12:37.730 "zone_append": false, 00:12:37.730 "compare": false, 00:12:37.730 "compare_and_write": false, 00:12:37.730 "abort": true, 00:12:37.730 "seek_hole": false, 00:12:37.730 "seek_data": false, 00:12:37.730 "copy": true, 00:12:37.730 
"nvme_iov_md": false 00:12:37.730 }, 00:12:37.730 "memory_domains": [ 00:12:37.730 { 00:12:37.730 "dma_device_id": "system", 00:12:37.730 "dma_device_type": 1 00:12:37.730 }, 00:12:37.730 { 00:12:37.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.730 "dma_device_type": 2 00:12:37.730 } 00:12:37.730 ], 00:12:37.730 "driver_specific": {} 00:12:37.730 }' 00:12:37.730 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.990 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:37.990 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:37.990 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.990 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:37.990 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:37.990 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.990 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:37.990 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:37.990 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.251 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.251 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.251 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:38.251 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:38.251 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:38.251 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:38.251 "name": "BaseBdev2", 00:12:38.251 "aliases": [ 00:12:38.251 "5a88424a-70f4-4af9-8d4b-a147fbad475b" 00:12:38.251 ], 00:12:38.251 "product_name": "Malloc disk", 00:12:38.251 "block_size": 512, 00:12:38.251 "num_blocks": 65536, 00:12:38.251 "uuid": "5a88424a-70f4-4af9-8d4b-a147fbad475b", 00:12:38.251 "assigned_rate_limits": { 00:12:38.251 "rw_ios_per_sec": 0, 00:12:38.251 "rw_mbytes_per_sec": 0, 00:12:38.251 "r_mbytes_per_sec": 0, 00:12:38.251 "w_mbytes_per_sec": 0 00:12:38.251 }, 00:12:38.251 "claimed": true, 00:12:38.251 "claim_type": "exclusive_write", 00:12:38.251 "zoned": false, 00:12:38.251 "supported_io_types": { 00:12:38.251 "read": true, 00:12:38.251 "write": true, 00:12:38.251 "unmap": true, 00:12:38.251 "flush": true, 00:12:38.251 "reset": true, 00:12:38.251 "nvme_admin": false, 00:12:38.251 "nvme_io": false, 00:12:38.251 "nvme_io_md": false, 00:12:38.251 "write_zeroes": true, 00:12:38.251 "zcopy": true, 00:12:38.251 "get_zone_info": false, 00:12:38.251 "zone_management": false, 00:12:38.251 "zone_append": false, 00:12:38.251 "compare": false, 00:12:38.251 "compare_and_write": false, 00:12:38.251 "abort": true, 00:12:38.251 "seek_hole": false, 00:12:38.251 "seek_data": false, 00:12:38.251 "copy": true, 00:12:38.251 "nvme_iov_md": false 00:12:38.251 }, 00:12:38.251 "memory_domains": [ 00:12:38.251 { 00:12:38.251 "dma_device_id": "system", 00:12:38.251 "dma_device_type": 1 00:12:38.251 }, 
00:12:38.251 { 00:12:38.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.251 "dma_device_type": 2 00:12:38.251 } 00:12:38.251 ], 00:12:38.251 "driver_specific": {} 00:12:38.251 }' 00:12:38.512 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.512 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.512 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:38.512 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.512 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.512 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:38.512 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.512 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:38.775 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:38.775 15:48:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.775 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:38.775 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:38.775 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:39.035 [2024-07-12 15:48:59.232156] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:39.035 [2024-07-12 15:48:59.232173] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:39.035 [2024-07-12 15:48:59.232206] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:39.035 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.036 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.036 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:12:39.036 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.036 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.036 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.036 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.036 "name": "Existed_Raid", 00:12:39.036 "uuid": "cba6fb6c-0367-4b8d-b453-43e6bfa84acb", 00:12:39.036 "strip_size_kb": 64, 00:12:39.036 "state": "offline", 00:12:39.036 "raid_level": "concat", 00:12:39.036 "superblock": false, 00:12:39.036 "num_base_bdevs": 2, 00:12:39.036 "num_base_bdevs_discovered": 1, 00:12:39.036 "num_base_bdevs_operational": 1, 00:12:39.036 "base_bdevs_list": [ 00:12:39.036 { 00:12:39.036 "name": null, 00:12:39.036 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.036 "is_configured": false, 00:12:39.036 "data_offset": 0, 00:12:39.036 "data_size": 65536 00:12:39.036 }, 00:12:39.036 { 00:12:39.036 "name": "BaseBdev2", 00:12:39.036 "uuid": "5a88424a-70f4-4af9-8d4b-a147fbad475b", 00:12:39.036 "is_configured": true, 00:12:39.036 "data_offset": 0, 00:12:39.036 "data_size": 65536 00:12:39.036 } 00:12:39.036 ] 00:12:39.036 }' 00:12:39.036 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.036 15:48:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.606 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:39.606 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:39.606 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.606 15:48:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:39.865 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:39.865 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:39.865 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:40.124 [2024-07-12 15:49:00.367035] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:40.124 [2024-07-12 15:49:00.367071] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe56e80 name Existed_Raid, state offline 00:12:40.124 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:40.124 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:40.124 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.124 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:40.384 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:40.384 15:49:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:40.384 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:40.384 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2505409 00:12:40.384 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2505409 ']' 00:12:40.384 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2505409 00:12:40.384 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:40.384 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2505409 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2505409' 00:12:40.385 killing process with pid 2505409 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2505409 00:12:40.385 [2024-07-12 15:49:00.630080] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2505409 00:12:40.385 [2024-07-12 15:49:00.630659] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:40.385 00:12:40.385 real 0m8.885s 00:12:40.385 user 0m16.151s 00:12:40.385 sys 0m1.348s 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:40.385 ************************************ 00:12:40.385 END TEST raid_state_function_test 00:12:40.385 ************************************ 00:12:40.385 15:49:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:40.385 15:49:00 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:12:40.385 15:49:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:40.385 15:49:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:40.385 15:49:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:40.385 ************************************ 00:12:40.385 START TEST raid_state_function_test_sb 00:12:40.385 ************************************ 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:40.385 15:49:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:40.385 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2507161 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2507161' 00:12:40.645 Process raid pid: 2507161 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2507161 /var/tmp/spdk-raid.sock 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2507161 ']' 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:40.645 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:40.645 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:40.645 [2024-07-12 15:49:00.900155] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:12:40.645 [2024-07-12 15:49:00.900210] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:40.645 [2024-07-12 15:49:00.991549] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:40.645 [2024-07-12 15:49:01.060091] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:40.907 [2024-07-12 15:49:01.115229] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:40.907 [2024-07-12 15:49:01.115254] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:41.481 [2024-07-12 15:49:01.886313] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:41.481 [2024-07-12 15:49:01.886338] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:41.481 [2024-07-12 15:49:01.886344] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:41.481 [2024-07-12 15:49:01.886350] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.481 15:49:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:12:41.742 15:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.742 "name": "Existed_Raid", 00:12:41.742 "uuid": "9ab96260-1356-4541-95a9-5fc12c94f32e", 00:12:41.742 "strip_size_kb": 64, 00:12:41.742 "state": "configuring", 00:12:41.742 "raid_level": "concat", 00:12:41.742 "superblock": true, 00:12:41.742 "num_base_bdevs": 2, 00:12:41.742 "num_base_bdevs_discovered": 0, 00:12:41.742 "num_base_bdevs_operational": 2, 00:12:41.742 "base_bdevs_list": [ 00:12:41.742 { 00:12:41.742 "name": "BaseBdev1", 00:12:41.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.742 "is_configured": false, 00:12:41.742 "data_offset": 0, 00:12:41.742 "data_size": 0 00:12:41.742 }, 00:12:41.742 { 00:12:41.742 "name": "BaseBdev2", 00:12:41.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:41.742 "is_configured": false, 00:12:41.742 "data_offset": 0, 00:12:41.742 "data_size": 0 00:12:41.742 } 00:12:41.742 ] 00:12:41.742 }' 00:12:41.742 15:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.742 15:49:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:42.314 15:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:42.575 [2024-07-12 15:49:02.800508] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:42.575 [2024-07-12 15:49:02.800528] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18ab900 name Existed_Raid, state configuring 00:12:42.575 15:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:42.575 [2024-07-12 15:49:02.976979] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:42.575 [2024-07-12 15:49:02.976995] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:42.575 [2024-07-12 15:49:02.977000] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:42.575 [2024-07-12 15:49:02.977006] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:42.575 15:49:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:42.835 [2024-07-12 15:49:03.172148] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:42.835 BaseBdev1 00:12:42.835 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:42.835 15:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:42.835 15:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:42.835 15:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:42.835 15:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:42.835 15:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:42.835 
15:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:43.096 15:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:43.358 [ 00:12:43.358 { 00:12:43.358 "name": "BaseBdev1", 00:12:43.358 "aliases": [ 00:12:43.358 "be7ffd0f-3c20-4c38-a008-58712734c92b" 00:12:43.358 ], 00:12:43.358 "product_name": "Malloc disk", 00:12:43.358 "block_size": 512, 00:12:43.358 "num_blocks": 65536, 00:12:43.358 "uuid": "be7ffd0f-3c20-4c38-a008-58712734c92b", 00:12:43.358 "assigned_rate_limits": { 00:12:43.358 "rw_ios_per_sec": 0, 00:12:43.358 "rw_mbytes_per_sec": 0, 00:12:43.358 "r_mbytes_per_sec": 0, 00:12:43.358 "w_mbytes_per_sec": 0 00:12:43.358 }, 00:12:43.358 "claimed": true, 00:12:43.358 "claim_type": "exclusive_write", 00:12:43.358 "zoned": false, 00:12:43.358 "supported_io_types": { 00:12:43.358 "read": true, 00:12:43.358 "write": true, 00:12:43.358 "unmap": true, 00:12:43.358 "flush": true, 00:12:43.358 "reset": true, 00:12:43.358 "nvme_admin": false, 00:12:43.358 "nvme_io": false, 00:12:43.358 "nvme_io_md": false, 00:12:43.358 "write_zeroes": true, 00:12:43.358 "zcopy": true, 00:12:43.358 "get_zone_info": false, 00:12:43.358 "zone_management": false, 00:12:43.358 "zone_append": false, 00:12:43.358 "compare": false, 00:12:43.358 "compare_and_write": false, 00:12:43.358 "abort": true, 00:12:43.358 "seek_hole": false, 00:12:43.358 "seek_data": false, 00:12:43.358 "copy": true, 00:12:43.358 "nvme_iov_md": false 00:12:43.358 }, 00:12:43.358 "memory_domains": [ 00:12:43.358 { 00:12:43.358 "dma_device_id": "system", 00:12:43.358 "dma_device_type": 1 00:12:43.358 }, 00:12:43.358 { 00:12:43.358 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:43.358 "dma_device_type": 2 00:12:43.358 } 00:12:43.358 ], 00:12:43.358 "driver_specific": {} 00:12:43.358 } 00:12:43.358 ] 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:43.358 "name": "Existed_Raid", 00:12:43.358 "uuid": "01118e4f-694a-4e45-a1da-0592e8695b49", 00:12:43.358 "strip_size_kb": 64, 00:12:43.358 "state": "configuring", 00:12:43.358 "raid_level": "concat", 00:12:43.358 "superblock": true, 00:12:43.358 "num_base_bdevs": 2, 00:12:43.358 "num_base_bdevs_discovered": 1, 00:12:43.358 "num_base_bdevs_operational": 2, 00:12:43.358 "base_bdevs_list": [ 00:12:43.358 { 00:12:43.358 "name": "BaseBdev1", 00:12:43.358 "uuid": "be7ffd0f-3c20-4c38-a008-58712734c92b", 00:12:43.358 "is_configured": true, 00:12:43.358 "data_offset": 2048, 00:12:43.358 "data_size": 63488 00:12:43.358 }, 00:12:43.358 { 00:12:43.358 "name": "BaseBdev2", 00:12:43.358 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:43.358 "is_configured": false, 00:12:43.358 "data_offset": 0, 00:12:43.358 "data_size": 0 00:12:43.358 } 00:12:43.358 ] 00:12:43.358 }' 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:43.358 15:49:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:43.929 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:44.190 [2024-07-12 15:49:04.491481] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:44.190 [2024-07-12 15:49:04.491505] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18ab1d0 name Existed_Raid, state configuring 00:12:44.190 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:44.450 [2024-07-12 15:49:04.679987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:44.450 [2024-07-12 15:49:04.681113] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:44.450 [2024-07-12 15:49:04.681135] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- 
# local raid_bdev_info 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.450 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:44.709 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.709 "name": "Existed_Raid", 00:12:44.709 "uuid": "925b3b2e-8221-45a5-8a4c-a5c5b343c5c0", 00:12:44.709 "strip_size_kb": 64, 00:12:44.709 "state": "configuring", 00:12:44.709 "raid_level": "concat", 00:12:44.709 "superblock": true, 00:12:44.709 "num_base_bdevs": 2, 00:12:44.709 "num_base_bdevs_discovered": 1, 00:12:44.709 "num_base_bdevs_operational": 2, 00:12:44.709 "base_bdevs_list": [ 00:12:44.709 { 00:12:44.709 "name": "BaseBdev1", 00:12:44.709 "uuid": "be7ffd0f-3c20-4c38-a008-58712734c92b", 00:12:44.709 "is_configured": true, 00:12:44.709 "data_offset": 2048, 00:12:44.709 "data_size": 63488 00:12:44.709 }, 00:12:44.709 { 00:12:44.709 "name": "BaseBdev2", 00:12:44.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:44.709 "is_configured": false, 00:12:44.709 "data_offset": 0, 00:12:44.709 "data_size": 0 00:12:44.709 } 00:12:44.709 ] 00:12:44.709 }' 00:12:44.709 15:49:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.709 15:49:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:45.279 15:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:45.279 [2024-07-12 15:49:05.627284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:45.279 [2024-07-12 15:49:05.627392] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18abe80 00:12:45.279 [2024-07-12 15:49:05.627400] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:45.279 [2024-07-12 15:49:05.627535] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15aa290 00:12:45.279 [2024-07-12 15:49:05.627621] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18abe80 00:12:45.279 [2024-07-12 15:49:05.627627] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x18abe80 00:12:45.279 [2024-07-12 15:49:05.627694] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:45.279 BaseBdev2 00:12:45.279 15:49:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:45.279 15:49:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:45.279 15:49:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:45.279 15:49:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:45.279 15:49:05 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:45.279 15:49:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:45.279 15:49:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:45.538 15:49:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:45.798 [ 00:12:45.798 { 00:12:45.798 "name": "BaseBdev2", 00:12:45.798 "aliases": [ 00:12:45.798 "53ac6a56-3519-4841-b17d-02bbe88311b6" 00:12:45.798 ], 00:12:45.798 "product_name": "Malloc disk", 00:12:45.798 "block_size": 512, 00:12:45.798 "num_blocks": 65536, 00:12:45.798 "uuid": "53ac6a56-3519-4841-b17d-02bbe88311b6", 00:12:45.798 "assigned_rate_limits": { 00:12:45.798 "rw_ios_per_sec": 0, 00:12:45.798 "rw_mbytes_per_sec": 0, 00:12:45.798 "r_mbytes_per_sec": 0, 00:12:45.798 "w_mbytes_per_sec": 0 00:12:45.798 }, 00:12:45.798 "claimed": true, 00:12:45.798 "claim_type": "exclusive_write", 00:12:45.798 "zoned": false, 00:12:45.798 "supported_io_types": { 00:12:45.798 "read": true, 00:12:45.798 "write": true, 00:12:45.798 "unmap": true, 00:12:45.798 "flush": true, 00:12:45.798 "reset": true, 00:12:45.798 "nvme_admin": false, 00:12:45.798 "nvme_io": false, 00:12:45.798 "nvme_io_md": false, 00:12:45.798 "write_zeroes": true, 00:12:45.798 "zcopy": true, 00:12:45.798 "get_zone_info": false, 00:12:45.798 "zone_management": false, 00:12:45.798 "zone_append": false, 00:12:45.798 "compare": false, 00:12:45.798 "compare_and_write": false, 00:12:45.798 "abort": true, 00:12:45.798 "seek_hole": false, 00:12:45.798 "seek_data": false, 00:12:45.798 "copy": true, 00:12:45.798 "nvme_iov_md": false 00:12:45.798 }, 00:12:45.798 "memory_domains": [ 00:12:45.798 { 00:12:45.798 "dma_device_id": "system", 00:12:45.798 "dma_device_type": 1 00:12:45.798 }, 00:12:45.798 { 00:12:45.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.798 "dma_device_type": 2 00:12:45.798 } 00:12:45.798 ], 00:12:45.798 "driver_specific": {} 00:12:45.798 } 00:12:45.798 ] 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.798 
15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:45.798 "name": "Existed_Raid", 00:12:45.798 "uuid": "925b3b2e-8221-45a5-8a4c-a5c5b343c5c0", 00:12:45.798 "strip_size_kb": 64, 00:12:45.798 "state": "online", 00:12:45.798 "raid_level": "concat", 00:12:45.798 "superblock": true, 00:12:45.798 "num_base_bdevs": 2, 00:12:45.798 "num_base_bdevs_discovered": 2, 00:12:45.798 "num_base_bdevs_operational": 2, 00:12:45.798 "base_bdevs_list": [ 00:12:45.798 { 00:12:45.798 "name": "BaseBdev1", 00:12:45.798 "uuid": "be7ffd0f-3c20-4c38-a008-58712734c92b", 00:12:45.798 "is_configured": true, 00:12:45.798 "data_offset": 2048, 00:12:45.798 "data_size": 63488 00:12:45.798 }, 00:12:45.798 { 00:12:45.798 "name": "BaseBdev2", 00:12:45.798 "uuid": "53ac6a56-3519-4841-b17d-02bbe88311b6", 00:12:45.798 "is_configured": true, 00:12:45.798 "data_offset": 2048, 00:12:45.798 "data_size": 63488 00:12:45.798 } 00:12:45.798 ] 00:12:45.798 }' 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:45.798 15:49:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:46.366 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:46.366 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:46.366 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:46.366 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:46.366 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:46.366 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:46.366 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:46.366 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:46.627 [2024-07-12 15:49:06.918767] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:46.627 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:46.627 "name": "Existed_Raid", 00:12:46.627 "aliases": [ 00:12:46.627 "925b3b2e-8221-45a5-8a4c-a5c5b343c5c0" 00:12:46.627 ], 00:12:46.627 "product_name": "Raid Volume", 00:12:46.627 "block_size": 512, 00:12:46.627 "num_blocks": 126976, 00:12:46.627 "uuid": "925b3b2e-8221-45a5-8a4c-a5c5b343c5c0", 00:12:46.627 "assigned_rate_limits": { 00:12:46.627 "rw_ios_per_sec": 0, 00:12:46.627 "rw_mbytes_per_sec": 0, 00:12:46.627 "r_mbytes_per_sec": 0, 00:12:46.627 "w_mbytes_per_sec": 0 00:12:46.627 }, 00:12:46.627 "claimed": false, 00:12:46.627 "zoned": false, 00:12:46.627 
"supported_io_types": { 00:12:46.627 "read": true, 00:12:46.627 "write": true, 00:12:46.627 "unmap": true, 00:12:46.627 "flush": true, 00:12:46.627 "reset": true, 00:12:46.627 "nvme_admin": false, 00:12:46.627 "nvme_io": false, 00:12:46.627 "nvme_io_md": false, 00:12:46.627 "write_zeroes": true, 00:12:46.627 "zcopy": false, 00:12:46.627 "get_zone_info": false, 00:12:46.627 "zone_management": false, 00:12:46.627 "zone_append": false, 00:12:46.627 "compare": false, 00:12:46.627 "compare_and_write": false, 00:12:46.627 "abort": false, 00:12:46.627 "seek_hole": false, 00:12:46.627 "seek_data": false, 00:12:46.627 "copy": false, 00:12:46.627 "nvme_iov_md": false 00:12:46.627 }, 00:12:46.627 "memory_domains": [ 00:12:46.627 { 00:12:46.627 "dma_device_id": "system", 00:12:46.627 "dma_device_type": 1 00:12:46.627 }, 00:12:46.627 { 00:12:46.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.627 "dma_device_type": 2 00:12:46.627 }, 00:12:46.627 { 00:12:46.627 "dma_device_id": "system", 00:12:46.627 "dma_device_type": 1 00:12:46.627 }, 00:12:46.627 { 00:12:46.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.627 "dma_device_type": 2 00:12:46.627 } 00:12:46.627 ], 00:12:46.627 "driver_specific": { 00:12:46.627 "raid": { 00:12:46.627 "uuid": "925b3b2e-8221-45a5-8a4c-a5c5b343c5c0", 00:12:46.627 "strip_size_kb": 64, 00:12:46.627 "state": "online", 00:12:46.627 "raid_level": "concat", 00:12:46.627 "superblock": true, 00:12:46.627 "num_base_bdevs": 2, 00:12:46.627 "num_base_bdevs_discovered": 2, 00:12:46.627 "num_base_bdevs_operational": 2, 00:12:46.627 "base_bdevs_list": [ 00:12:46.627 { 00:12:46.627 "name": "BaseBdev1", 00:12:46.627 "uuid": "be7ffd0f-3c20-4c38-a008-58712734c92b", 00:12:46.627 "is_configured": true, 00:12:46.627 "data_offset": 2048, 00:12:46.627 "data_size": 63488 00:12:46.627 }, 00:12:46.627 { 00:12:46.627 "name": "BaseBdev2", 00:12:46.627 "uuid": "53ac6a56-3519-4841-b17d-02bbe88311b6", 00:12:46.627 "is_configured": true, 00:12:46.627 "data_offset": 2048, 00:12:46.627 "data_size": 63488 00:12:46.627 } 00:12:46.627 ] 00:12:46.627 } 00:12:46.627 } 00:12:46.627 }' 00:12:46.627 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:46.627 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:46.627 BaseBdev2' 00:12:46.627 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:46.627 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:46.627 15:49:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:46.888 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:46.888 "name": "BaseBdev1", 00:12:46.888 "aliases": [ 00:12:46.888 "be7ffd0f-3c20-4c38-a008-58712734c92b" 00:12:46.888 ], 00:12:46.888 "product_name": "Malloc disk", 00:12:46.888 "block_size": 512, 00:12:46.888 "num_blocks": 65536, 00:12:46.888 "uuid": "be7ffd0f-3c20-4c38-a008-58712734c92b", 00:12:46.888 "assigned_rate_limits": { 00:12:46.888 "rw_ios_per_sec": 0, 00:12:46.888 "rw_mbytes_per_sec": 0, 00:12:46.888 "r_mbytes_per_sec": 0, 00:12:46.888 "w_mbytes_per_sec": 0 00:12:46.888 }, 00:12:46.888 "claimed": true, 00:12:46.888 "claim_type": "exclusive_write", 00:12:46.888 "zoned": 
false, 00:12:46.888 "supported_io_types": { 00:12:46.888 "read": true, 00:12:46.888 "write": true, 00:12:46.888 "unmap": true, 00:12:46.888 "flush": true, 00:12:46.888 "reset": true, 00:12:46.888 "nvme_admin": false, 00:12:46.888 "nvme_io": false, 00:12:46.888 "nvme_io_md": false, 00:12:46.888 "write_zeroes": true, 00:12:46.888 "zcopy": true, 00:12:46.888 "get_zone_info": false, 00:12:46.888 "zone_management": false, 00:12:46.888 "zone_append": false, 00:12:46.888 "compare": false, 00:12:46.888 "compare_and_write": false, 00:12:46.888 "abort": true, 00:12:46.888 "seek_hole": false, 00:12:46.888 "seek_data": false, 00:12:46.888 "copy": true, 00:12:46.888 "nvme_iov_md": false 00:12:46.888 }, 00:12:46.888 "memory_domains": [ 00:12:46.888 { 00:12:46.888 "dma_device_id": "system", 00:12:46.888 "dma_device_type": 1 00:12:46.888 }, 00:12:46.888 { 00:12:46.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:46.888 "dma_device_type": 2 00:12:46.888 } 00:12:46.888 ], 00:12:46.888 "driver_specific": {} 00:12:46.888 }' 00:12:46.888 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:46.888 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:46.888 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:46.888 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:46.888 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:47.148 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:47.408 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:47.408 "name": "BaseBdev2", 00:12:47.408 "aliases": [ 00:12:47.408 "53ac6a56-3519-4841-b17d-02bbe88311b6" 00:12:47.408 ], 00:12:47.408 "product_name": "Malloc disk", 00:12:47.408 "block_size": 512, 00:12:47.408 "num_blocks": 65536, 00:12:47.408 "uuid": "53ac6a56-3519-4841-b17d-02bbe88311b6", 00:12:47.408 "assigned_rate_limits": { 00:12:47.408 "rw_ios_per_sec": 0, 00:12:47.408 "rw_mbytes_per_sec": 0, 00:12:47.408 "r_mbytes_per_sec": 0, 00:12:47.408 "w_mbytes_per_sec": 0 00:12:47.408 }, 00:12:47.408 "claimed": true, 00:12:47.408 "claim_type": "exclusive_write", 00:12:47.408 "zoned": false, 00:12:47.408 "supported_io_types": { 00:12:47.408 "read": true, 00:12:47.408 "write": true, 00:12:47.408 "unmap": true, 
00:12:47.408 "flush": true, 00:12:47.408 "reset": true, 00:12:47.408 "nvme_admin": false, 00:12:47.408 "nvme_io": false, 00:12:47.408 "nvme_io_md": false, 00:12:47.408 "write_zeroes": true, 00:12:47.408 "zcopy": true, 00:12:47.408 "get_zone_info": false, 00:12:47.408 "zone_management": false, 00:12:47.408 "zone_append": false, 00:12:47.408 "compare": false, 00:12:47.408 "compare_and_write": false, 00:12:47.408 "abort": true, 00:12:47.408 "seek_hole": false, 00:12:47.408 "seek_data": false, 00:12:47.408 "copy": true, 00:12:47.408 "nvme_iov_md": false 00:12:47.408 }, 00:12:47.408 "memory_domains": [ 00:12:47.408 { 00:12:47.408 "dma_device_id": "system", 00:12:47.408 "dma_device_type": 1 00:12:47.408 }, 00:12:47.408 { 00:12:47.408 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.408 "dma_device_type": 2 00:12:47.408 } 00:12:47.408 ], 00:12:47.408 "driver_specific": {} 00:12:47.408 }' 00:12:47.408 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.408 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:47.408 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:47.408 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.668 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:47.668 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:47.668 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.668 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:47.668 15:49:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:47.668 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.668 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:47.668 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:47.668 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:47.927 [2024-07-12 15:49:08.249968] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:47.927 [2024-07-12 15:49:08.249985] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:47.927 [2024-07-12 15:49:08.250020] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:47.927 
15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.927 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.928 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.928 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.928 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:48.187 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.187 "name": "Existed_Raid", 00:12:48.187 "uuid": "925b3b2e-8221-45a5-8a4c-a5c5b343c5c0", 00:12:48.187 "strip_size_kb": 64, 00:12:48.188 "state": "offline", 00:12:48.188 "raid_level": "concat", 00:12:48.188 "superblock": true, 00:12:48.188 "num_base_bdevs": 2, 00:12:48.188 "num_base_bdevs_discovered": 1, 00:12:48.188 "num_base_bdevs_operational": 1, 00:12:48.188 "base_bdevs_list": [ 00:12:48.188 { 00:12:48.188 "name": null, 00:12:48.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.188 "is_configured": false, 00:12:48.188 "data_offset": 2048, 00:12:48.188 "data_size": 63488 00:12:48.188 }, 00:12:48.188 { 00:12:48.188 "name": "BaseBdev2", 00:12:48.188 "uuid": "53ac6a56-3519-4841-b17d-02bbe88311b6", 00:12:48.188 "is_configured": true, 00:12:48.188 "data_offset": 2048, 00:12:48.188 "data_size": 63488 00:12:48.188 } 00:12:48.188 ] 00:12:48.188 }' 00:12:48.188 15:49:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.188 15:49:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:48.760 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:48.760 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:48.760 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.760 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:49.018 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:49.018 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:49.018 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:49.018 [2024-07-12 15:49:09.368810] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:49.018 [2024-07-12 
15:49:09.368843] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18abe80 name Existed_Raid, state offline 00:12:49.018 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:49.018 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:49.018 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.018 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2507161 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2507161 ']' 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2507161 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2507161 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2507161' 00:12:49.279 killing process with pid 2507161 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2507161 00:12:49.279 [2024-07-12 15:49:09.633044] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:49.279 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2507161 00:12:49.279 [2024-07-12 15:49:09.633634] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:49.540 15:49:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:49.540 00:12:49.540 real 0m8.924s 00:12:49.540 user 0m16.191s 00:12:49.540 sys 0m1.367s 00:12:49.540 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:49.540 15:49:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:49.540 ************************************ 00:12:49.540 END TEST raid_state_function_test_sb 00:12:49.540 ************************************ 00:12:49.540 15:49:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:49.540 15:49:09 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:12:49.540 15:49:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:49.540 15:49:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:49.540 15:49:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 
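The raid_superblock_test that starts below exercises the same RPC surface, but assembles the array from passthru bdevs so the on-disk superblock path is covered: the test spawns its own bdev_svc app on /var/tmp/spdk-raid.sock (visible just after this point), wraps two malloc bdevs in passthru bdevs with fixed UUIDs, and creates raid_bdev1 with a superblock. A condensed sketch of that setup, reconstructed from the commands in the following trace (paths, names, and UUIDs are the ones in the log; treat it as illustrative rather than the full test):

    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # Two malloc backing devices, each wrapped in a passthru bdev with a fixed UUID.
    $RPC bdev_malloc_create 32 512 -b malloc1
    $RPC bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
    $RPC bdev_malloc_create 32 512 -b malloc2
    $RPC bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002

    # Assemble the concat array over the passthru bdevs, writing a superblock (-s).
    $RPC bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s

    # The volume should come up online with both base bdevs discovered.
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'

Later in the trace, a second bdev_raid_create aimed directly at 'malloc1 malloc2' is expected to fail with JSON-RPC error -17 ("Failed to create RAID bdev raid_bdev1: File exists"), because a raid superblock is already present on those bdevs ("Superblock of a different raid bdev found on bdev malloc1").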
00:12:49.540 ************************************ 00:12:49.540 START TEST raid_superblock_test 00:12:49.540 ************************************ 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2508902 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2508902 /var/tmp/spdk-raid.sock 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2508902 ']' 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:49.540 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:49.540 15:49:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:49.540 [2024-07-12 15:49:09.883459] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:12:49.540 [2024-07-12 15:49:09.883502] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2508902 ] 00:12:49.540 [2024-07-12 15:49:09.970859] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:49.800 [2024-07-12 15:49:10.035279] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:49.800 [2024-07-12 15:49:10.075611] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:49.800 [2024-07-12 15:49:10.075634] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:50.371 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:50.631 malloc1 00:12:50.631 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:50.892 [2024-07-12 15:49:11.091137] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:50.892 [2024-07-12 15:49:11.091171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:50.892 [2024-07-12 15:49:11.091187] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb15b50 00:12:50.892 [2024-07-12 15:49:11.091194] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:50.892 [2024-07-12 15:49:11.092485] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:50.892 [2024-07-12 15:49:11.092505] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:50.892 pt1 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:50.892 malloc2 00:12:50.892 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:51.152 [2024-07-12 15:49:11.478092] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:51.152 [2024-07-12 15:49:11.478119] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:51.152 [2024-07-12 15:49:11.478129] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb16df0 00:12:51.152 [2024-07-12 15:49:11.478135] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:51.152 [2024-07-12 15:49:11.479287] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:51.152 [2024-07-12 15:49:11.479305] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:51.152 pt2 00:12:51.152 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:51.152 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:51.152 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:12:51.412 [2024-07-12 15:49:11.686631] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:51.412 [2024-07-12 15:49:11.687625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:51.412 [2024-07-12 15:49:11.687745] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcbc0f0 00:12:51.412 [2024-07-12 15:49:11.687753] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:51.412 [2024-07-12 15:49:11.687898] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb2ca40 00:12:51.412 [2024-07-12 15:49:11.688001] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcbc0f0 00:12:51.412 [2024-07-12 15:49:11.688007] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcbc0f0 00:12:51.412 [2024-07-12 15:49:11.688076] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:51.412 15:49:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.412 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:51.673 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.673 "name": "raid_bdev1", 00:12:51.673 "uuid": "c606399e-5763-4e16-b5b0-9c24bb1c7698", 00:12:51.673 "strip_size_kb": 64, 00:12:51.673 "state": "online", 00:12:51.673 "raid_level": "concat", 00:12:51.673 "superblock": true, 00:12:51.673 "num_base_bdevs": 2, 00:12:51.673 "num_base_bdevs_discovered": 2, 00:12:51.673 "num_base_bdevs_operational": 2, 00:12:51.673 "base_bdevs_list": [ 00:12:51.673 { 00:12:51.673 "name": "pt1", 00:12:51.673 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:51.673 "is_configured": true, 00:12:51.673 "data_offset": 2048, 00:12:51.673 "data_size": 63488 00:12:51.673 }, 00:12:51.673 { 00:12:51.673 "name": "pt2", 00:12:51.673 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:51.673 "is_configured": true, 00:12:51.673 "data_offset": 2048, 00:12:51.673 "data_size": 63488 00:12:51.673 } 00:12:51.673 ] 00:12:51.673 }' 00:12:51.673 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.673 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:52.245 [2024-07-12 15:49:12.569056] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:52.245 "name": "raid_bdev1", 00:12:52.245 "aliases": [ 00:12:52.245 "c606399e-5763-4e16-b5b0-9c24bb1c7698" 00:12:52.245 ], 00:12:52.245 "product_name": "Raid Volume", 00:12:52.245 "block_size": 512, 00:12:52.245 "num_blocks": 126976, 00:12:52.245 "uuid": 
"c606399e-5763-4e16-b5b0-9c24bb1c7698", 00:12:52.245 "assigned_rate_limits": { 00:12:52.245 "rw_ios_per_sec": 0, 00:12:52.245 "rw_mbytes_per_sec": 0, 00:12:52.245 "r_mbytes_per_sec": 0, 00:12:52.245 "w_mbytes_per_sec": 0 00:12:52.245 }, 00:12:52.245 "claimed": false, 00:12:52.245 "zoned": false, 00:12:52.245 "supported_io_types": { 00:12:52.245 "read": true, 00:12:52.245 "write": true, 00:12:52.245 "unmap": true, 00:12:52.245 "flush": true, 00:12:52.245 "reset": true, 00:12:52.245 "nvme_admin": false, 00:12:52.245 "nvme_io": false, 00:12:52.245 "nvme_io_md": false, 00:12:52.245 "write_zeroes": true, 00:12:52.245 "zcopy": false, 00:12:52.245 "get_zone_info": false, 00:12:52.245 "zone_management": false, 00:12:52.245 "zone_append": false, 00:12:52.245 "compare": false, 00:12:52.245 "compare_and_write": false, 00:12:52.245 "abort": false, 00:12:52.245 "seek_hole": false, 00:12:52.245 "seek_data": false, 00:12:52.245 "copy": false, 00:12:52.245 "nvme_iov_md": false 00:12:52.245 }, 00:12:52.245 "memory_domains": [ 00:12:52.245 { 00:12:52.245 "dma_device_id": "system", 00:12:52.245 "dma_device_type": 1 00:12:52.245 }, 00:12:52.245 { 00:12:52.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.245 "dma_device_type": 2 00:12:52.245 }, 00:12:52.245 { 00:12:52.245 "dma_device_id": "system", 00:12:52.245 "dma_device_type": 1 00:12:52.245 }, 00:12:52.245 { 00:12:52.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.245 "dma_device_type": 2 00:12:52.245 } 00:12:52.245 ], 00:12:52.245 "driver_specific": { 00:12:52.245 "raid": { 00:12:52.245 "uuid": "c606399e-5763-4e16-b5b0-9c24bb1c7698", 00:12:52.245 "strip_size_kb": 64, 00:12:52.245 "state": "online", 00:12:52.245 "raid_level": "concat", 00:12:52.245 "superblock": true, 00:12:52.245 "num_base_bdevs": 2, 00:12:52.245 "num_base_bdevs_discovered": 2, 00:12:52.245 "num_base_bdevs_operational": 2, 00:12:52.245 "base_bdevs_list": [ 00:12:52.245 { 00:12:52.245 "name": "pt1", 00:12:52.245 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:52.245 "is_configured": true, 00:12:52.245 "data_offset": 2048, 00:12:52.245 "data_size": 63488 00:12:52.245 }, 00:12:52.245 { 00:12:52.245 "name": "pt2", 00:12:52.245 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:52.245 "is_configured": true, 00:12:52.245 "data_offset": 2048, 00:12:52.245 "data_size": 63488 00:12:52.245 } 00:12:52.245 ] 00:12:52.245 } 00:12:52.245 } 00:12:52.245 }' 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:52.245 pt2' 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:52.245 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:52.506 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:52.506 "name": "pt1", 00:12:52.506 "aliases": [ 00:12:52.506 "00000000-0000-0000-0000-000000000001" 00:12:52.506 ], 00:12:52.506 "product_name": "passthru", 00:12:52.506 "block_size": 512, 00:12:52.506 "num_blocks": 65536, 00:12:52.506 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:52.506 "assigned_rate_limits": { 00:12:52.506 
"rw_ios_per_sec": 0, 00:12:52.506 "rw_mbytes_per_sec": 0, 00:12:52.506 "r_mbytes_per_sec": 0, 00:12:52.506 "w_mbytes_per_sec": 0 00:12:52.506 }, 00:12:52.506 "claimed": true, 00:12:52.506 "claim_type": "exclusive_write", 00:12:52.506 "zoned": false, 00:12:52.506 "supported_io_types": { 00:12:52.506 "read": true, 00:12:52.506 "write": true, 00:12:52.506 "unmap": true, 00:12:52.506 "flush": true, 00:12:52.506 "reset": true, 00:12:52.506 "nvme_admin": false, 00:12:52.506 "nvme_io": false, 00:12:52.506 "nvme_io_md": false, 00:12:52.506 "write_zeroes": true, 00:12:52.506 "zcopy": true, 00:12:52.506 "get_zone_info": false, 00:12:52.506 "zone_management": false, 00:12:52.506 "zone_append": false, 00:12:52.507 "compare": false, 00:12:52.507 "compare_and_write": false, 00:12:52.507 "abort": true, 00:12:52.507 "seek_hole": false, 00:12:52.507 "seek_data": false, 00:12:52.507 "copy": true, 00:12:52.507 "nvme_iov_md": false 00:12:52.507 }, 00:12:52.507 "memory_domains": [ 00:12:52.507 { 00:12:52.507 "dma_device_id": "system", 00:12:52.507 "dma_device_type": 1 00:12:52.507 }, 00:12:52.507 { 00:12:52.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.507 "dma_device_type": 2 00:12:52.507 } 00:12:52.507 ], 00:12:52.507 "driver_specific": { 00:12:52.507 "passthru": { 00:12:52.507 "name": "pt1", 00:12:52.507 "base_bdev_name": "malloc1" 00:12:52.507 } 00:12:52.507 } 00:12:52.507 }' 00:12:52.507 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.507 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.507 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:52.507 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.767 15:49:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:52.767 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:53.028 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:53.028 "name": "pt2", 00:12:53.028 "aliases": [ 00:12:53.028 "00000000-0000-0000-0000-000000000002" 00:12:53.028 ], 00:12:53.028 "product_name": "passthru", 00:12:53.028 "block_size": 512, 00:12:53.028 "num_blocks": 65536, 00:12:53.028 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:53.028 "assigned_rate_limits": { 00:12:53.028 "rw_ios_per_sec": 0, 00:12:53.028 "rw_mbytes_per_sec": 0, 00:12:53.028 "r_mbytes_per_sec": 0, 00:12:53.028 "w_mbytes_per_sec": 0 
00:12:53.028 }, 00:12:53.028 "claimed": true, 00:12:53.028 "claim_type": "exclusive_write", 00:12:53.028 "zoned": false, 00:12:53.028 "supported_io_types": { 00:12:53.028 "read": true, 00:12:53.028 "write": true, 00:12:53.028 "unmap": true, 00:12:53.028 "flush": true, 00:12:53.028 "reset": true, 00:12:53.028 "nvme_admin": false, 00:12:53.028 "nvme_io": false, 00:12:53.028 "nvme_io_md": false, 00:12:53.028 "write_zeroes": true, 00:12:53.028 "zcopy": true, 00:12:53.028 "get_zone_info": false, 00:12:53.028 "zone_management": false, 00:12:53.028 "zone_append": false, 00:12:53.028 "compare": false, 00:12:53.028 "compare_and_write": false, 00:12:53.028 "abort": true, 00:12:53.028 "seek_hole": false, 00:12:53.028 "seek_data": false, 00:12:53.028 "copy": true, 00:12:53.028 "nvme_iov_md": false 00:12:53.028 }, 00:12:53.028 "memory_domains": [ 00:12:53.028 { 00:12:53.028 "dma_device_id": "system", 00:12:53.028 "dma_device_type": 1 00:12:53.028 }, 00:12:53.028 { 00:12:53.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:53.028 "dma_device_type": 2 00:12:53.029 } 00:12:53.029 ], 00:12:53.029 "driver_specific": { 00:12:53.029 "passthru": { 00:12:53.029 "name": "pt2", 00:12:53.029 "base_bdev_name": "malloc2" 00:12:53.029 } 00:12:53.029 } 00:12:53.029 }' 00:12:53.029 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.029 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:53.290 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:53.290 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.290 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:53.290 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:53.290 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.290 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:53.290 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:53.290 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.290 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:53.551 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:53.551 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:53.551 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:53.551 [2024-07-12 15:49:13.924462] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:53.551 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c606399e-5763-4e16-b5b0-9c24bb1c7698 00:12:53.551 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c606399e-5763-4e16-b5b0-9c24bb1c7698 ']' 00:12:53.551 15:49:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:53.811 [2024-07-12 15:49:14.088671] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:53.811 [2024-07-12 15:49:14.088682] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:12:53.811 [2024-07-12 15:49:14.088726] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:53.811 [2024-07-12 15:49:14.088758] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:53.811 [2024-07-12 15:49:14.088764] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcbc0f0 name raid_bdev1, state offline 00:12:53.811 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.811 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:54.071 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:54.071 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:54.071 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:54.071 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:54.071 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:54.071 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:54.331 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:54.331 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:54.592 [2024-07-12 15:49:14.970876] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:54.592 [2024-07-12 15:49:14.971918] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:54.592 [2024-07-12 15:49:14.971957] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:54.592 [2024-07-12 15:49:14.971984] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:54.592 [2024-07-12 15:49:14.971994] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:54.592 [2024-07-12 15:49:14.971999] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcbae00 name raid_bdev1, state configuring 00:12:54.592 request: 00:12:54.592 { 00:12:54.592 "name": "raid_bdev1", 00:12:54.592 "raid_level": "concat", 00:12:54.592 "base_bdevs": [ 00:12:54.592 "malloc1", 00:12:54.592 "malloc2" 00:12:54.592 ], 00:12:54.592 "strip_size_kb": 64, 00:12:54.592 "superblock": false, 00:12:54.592 "method": "bdev_raid_create", 00:12:54.592 "req_id": 1 00:12:54.592 } 00:12:54.592 Got JSON-RPC error response 00:12:54.592 response: 00:12:54.592 { 00:12:54.592 "code": -17, 00:12:54.592 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:54.592 } 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.592 15:49:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:54.853 [2024-07-12 15:49:15.275600] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:54.853 [2024-07-12 15:49:15.275620] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:54.853 [2024-07-12 15:49:15.275630] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcbb920 00:12:54.853 [2024-07-12 15:49:15.275636] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:54.853 [2024-07-12 15:49:15.276820] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:54.853 [2024-07-12 15:49:15.276838] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:54.853 [2024-07-12 15:49:15.276879] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:54.853 [2024-07-12 15:49:15.276895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:54.853 pt1 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.853 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:55.115 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:55.115 "name": "raid_bdev1", 00:12:55.115 "uuid": "c606399e-5763-4e16-b5b0-9c24bb1c7698", 00:12:55.115 "strip_size_kb": 64, 00:12:55.115 "state": "configuring", 00:12:55.115 "raid_level": "concat", 00:12:55.115 "superblock": true, 00:12:55.115 "num_base_bdevs": 2, 00:12:55.115 "num_base_bdevs_discovered": 1, 00:12:55.115 "num_base_bdevs_operational": 2, 00:12:55.115 "base_bdevs_list": [ 00:12:55.115 { 00:12:55.115 "name": "pt1", 00:12:55.115 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:55.115 "is_configured": true, 00:12:55.115 "data_offset": 2048, 00:12:55.115 "data_size": 63488 00:12:55.115 }, 00:12:55.115 { 00:12:55.115 "name": null, 00:12:55.115 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:55.115 "is_configured": false, 00:12:55.115 "data_offset": 2048, 00:12:55.115 "data_size": 63488 00:12:55.115 } 00:12:55.115 ] 00:12:55.115 }' 00:12:55.115 15:49:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:55.115 15:49:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:55.742 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:55.742 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:55.742 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:55.742 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:56.003 [2024-07-12 15:49:16.202001] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:56.003 [2024-07-12 15:49:16.202072] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:56.003 [2024-07-12 15:49:16.202089] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcbd230 00:12:56.003 [2024-07-12 15:49:16.202097] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:56.003 [2024-07-12 15:49:16.202518] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:56.003 [2024-07-12 15:49:16.202538] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:56.003 [2024-07-12 15:49:16.202610] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:56.003 [2024-07-12 15:49:16.202628] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:56.003 [2024-07-12 15:49:16.202738] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb14da0 00:12:56.003 [2024-07-12 15:49:16.202746] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:56.003 [2024-07-12 15:49:16.202904] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcc0760 00:12:56.003 [2024-07-12 15:49:16.203017] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb14da0 00:12:56.003 [2024-07-12 15:49:16.203023] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb14da0 00:12:56.003 [2024-07-12 15:49:16.203110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:56.003 pt2 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:12:56.003 "name": "raid_bdev1", 00:12:56.003 "uuid": "c606399e-5763-4e16-b5b0-9c24bb1c7698", 00:12:56.003 "strip_size_kb": 64, 00:12:56.003 "state": "online", 00:12:56.003 "raid_level": "concat", 00:12:56.003 "superblock": true, 00:12:56.003 "num_base_bdevs": 2, 00:12:56.003 "num_base_bdevs_discovered": 2, 00:12:56.003 "num_base_bdevs_operational": 2, 00:12:56.003 "base_bdevs_list": [ 00:12:56.003 { 00:12:56.003 "name": "pt1", 00:12:56.003 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:56.003 "is_configured": true, 00:12:56.003 "data_offset": 2048, 00:12:56.003 "data_size": 63488 00:12:56.003 }, 00:12:56.003 { 00:12:56.003 "name": "pt2", 00:12:56.003 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:56.003 "is_configured": true, 00:12:56.003 "data_offset": 2048, 00:12:56.003 "data_size": 63488 00:12:56.003 } 00:12:56.003 ] 00:12:56.003 }' 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.003 15:49:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.575 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:56.575 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:56.575 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:56.575 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:56.575 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:56.575 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:56.575 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:56.575 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:56.833 [2024-07-12 15:49:17.196708] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:56.833 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:56.833 "name": "raid_bdev1", 00:12:56.833 "aliases": [ 00:12:56.833 "c606399e-5763-4e16-b5b0-9c24bb1c7698" 00:12:56.833 ], 00:12:56.833 "product_name": "Raid Volume", 00:12:56.833 "block_size": 512, 00:12:56.833 "num_blocks": 126976, 00:12:56.833 "uuid": "c606399e-5763-4e16-b5b0-9c24bb1c7698", 00:12:56.833 "assigned_rate_limits": { 00:12:56.833 "rw_ios_per_sec": 0, 00:12:56.833 "rw_mbytes_per_sec": 0, 00:12:56.833 "r_mbytes_per_sec": 0, 00:12:56.833 "w_mbytes_per_sec": 0 00:12:56.833 }, 00:12:56.833 "claimed": false, 00:12:56.833 "zoned": false, 00:12:56.833 "supported_io_types": { 00:12:56.833 "read": true, 00:12:56.833 "write": true, 00:12:56.833 "unmap": true, 00:12:56.833 "flush": true, 00:12:56.833 "reset": true, 00:12:56.833 "nvme_admin": false, 00:12:56.833 "nvme_io": false, 00:12:56.833 "nvme_io_md": false, 00:12:56.833 "write_zeroes": true, 00:12:56.833 "zcopy": false, 00:12:56.833 "get_zone_info": false, 00:12:56.833 "zone_management": false, 00:12:56.834 "zone_append": false, 00:12:56.834 "compare": false, 00:12:56.834 "compare_and_write": false, 00:12:56.834 "abort": false, 00:12:56.834 "seek_hole": false, 00:12:56.834 "seek_data": false, 00:12:56.834 "copy": false, 00:12:56.834 "nvme_iov_md": false 00:12:56.834 }, 00:12:56.834 "memory_domains": [ 00:12:56.834 { 00:12:56.834 "dma_device_id": 
"system", 00:12:56.834 "dma_device_type": 1 00:12:56.834 }, 00:12:56.834 { 00:12:56.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.834 "dma_device_type": 2 00:12:56.834 }, 00:12:56.834 { 00:12:56.834 "dma_device_id": "system", 00:12:56.834 "dma_device_type": 1 00:12:56.834 }, 00:12:56.834 { 00:12:56.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.834 "dma_device_type": 2 00:12:56.834 } 00:12:56.834 ], 00:12:56.834 "driver_specific": { 00:12:56.834 "raid": { 00:12:56.834 "uuid": "c606399e-5763-4e16-b5b0-9c24bb1c7698", 00:12:56.834 "strip_size_kb": 64, 00:12:56.834 "state": "online", 00:12:56.834 "raid_level": "concat", 00:12:56.834 "superblock": true, 00:12:56.834 "num_base_bdevs": 2, 00:12:56.834 "num_base_bdevs_discovered": 2, 00:12:56.834 "num_base_bdevs_operational": 2, 00:12:56.834 "base_bdevs_list": [ 00:12:56.834 { 00:12:56.834 "name": "pt1", 00:12:56.834 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:56.834 "is_configured": true, 00:12:56.834 "data_offset": 2048, 00:12:56.834 "data_size": 63488 00:12:56.834 }, 00:12:56.834 { 00:12:56.834 "name": "pt2", 00:12:56.834 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:56.834 "is_configured": true, 00:12:56.834 "data_offset": 2048, 00:12:56.834 "data_size": 63488 00:12:56.834 } 00:12:56.834 ] 00:12:56.834 } 00:12:56.834 } 00:12:56.834 }' 00:12:56.834 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:56.834 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:56.834 pt2' 00:12:56.834 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:56.834 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:56.834 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:57.094 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:57.094 "name": "pt1", 00:12:57.094 "aliases": [ 00:12:57.094 "00000000-0000-0000-0000-000000000001" 00:12:57.094 ], 00:12:57.094 "product_name": "passthru", 00:12:57.094 "block_size": 512, 00:12:57.094 "num_blocks": 65536, 00:12:57.094 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:57.094 "assigned_rate_limits": { 00:12:57.094 "rw_ios_per_sec": 0, 00:12:57.094 "rw_mbytes_per_sec": 0, 00:12:57.094 "r_mbytes_per_sec": 0, 00:12:57.094 "w_mbytes_per_sec": 0 00:12:57.094 }, 00:12:57.094 "claimed": true, 00:12:57.094 "claim_type": "exclusive_write", 00:12:57.094 "zoned": false, 00:12:57.094 "supported_io_types": { 00:12:57.094 "read": true, 00:12:57.094 "write": true, 00:12:57.094 "unmap": true, 00:12:57.094 "flush": true, 00:12:57.094 "reset": true, 00:12:57.094 "nvme_admin": false, 00:12:57.094 "nvme_io": false, 00:12:57.094 "nvme_io_md": false, 00:12:57.094 "write_zeroes": true, 00:12:57.094 "zcopy": true, 00:12:57.094 "get_zone_info": false, 00:12:57.094 "zone_management": false, 00:12:57.094 "zone_append": false, 00:12:57.094 "compare": false, 00:12:57.094 "compare_and_write": false, 00:12:57.094 "abort": true, 00:12:57.094 "seek_hole": false, 00:12:57.094 "seek_data": false, 00:12:57.094 "copy": true, 00:12:57.094 "nvme_iov_md": false 00:12:57.094 }, 00:12:57.094 "memory_domains": [ 00:12:57.094 { 00:12:57.094 "dma_device_id": "system", 00:12:57.094 "dma_device_type": 1 00:12:57.094 }, 
00:12:57.094 { 00:12:57.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.094 "dma_device_type": 2 00:12:57.094 } 00:12:57.094 ], 00:12:57.094 "driver_specific": { 00:12:57.094 "passthru": { 00:12:57.094 "name": "pt1", 00:12:57.094 "base_bdev_name": "malloc1" 00:12:57.094 } 00:12:57.094 } 00:12:57.094 }' 00:12:57.094 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.094 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.354 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:57.354 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.354 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.354 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:57.354 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.354 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.354 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:57.354 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.354 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.614 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:57.614 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:57.614 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:57.614 15:49:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:57.614 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:57.614 "name": "pt2", 00:12:57.614 "aliases": [ 00:12:57.614 "00000000-0000-0000-0000-000000000002" 00:12:57.614 ], 00:12:57.614 "product_name": "passthru", 00:12:57.614 "block_size": 512, 00:12:57.614 "num_blocks": 65536, 00:12:57.614 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:57.614 "assigned_rate_limits": { 00:12:57.614 "rw_ios_per_sec": 0, 00:12:57.614 "rw_mbytes_per_sec": 0, 00:12:57.614 "r_mbytes_per_sec": 0, 00:12:57.614 "w_mbytes_per_sec": 0 00:12:57.614 }, 00:12:57.614 "claimed": true, 00:12:57.614 "claim_type": "exclusive_write", 00:12:57.614 "zoned": false, 00:12:57.614 "supported_io_types": { 00:12:57.614 "read": true, 00:12:57.614 "write": true, 00:12:57.614 "unmap": true, 00:12:57.614 "flush": true, 00:12:57.614 "reset": true, 00:12:57.614 "nvme_admin": false, 00:12:57.614 "nvme_io": false, 00:12:57.614 "nvme_io_md": false, 00:12:57.614 "write_zeroes": true, 00:12:57.614 "zcopy": true, 00:12:57.614 "get_zone_info": false, 00:12:57.614 "zone_management": false, 00:12:57.614 "zone_append": false, 00:12:57.614 "compare": false, 00:12:57.614 "compare_and_write": false, 00:12:57.614 "abort": true, 00:12:57.614 "seek_hole": false, 00:12:57.614 "seek_data": false, 00:12:57.614 "copy": true, 00:12:57.614 "nvme_iov_md": false 00:12:57.614 }, 00:12:57.614 "memory_domains": [ 00:12:57.614 { 00:12:57.614 "dma_device_id": "system", 00:12:57.614 "dma_device_type": 1 00:12:57.614 }, 00:12:57.614 { 00:12:57.614 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.614 "dma_device_type": 2 00:12:57.614 } 00:12:57.614 ], 
00:12:57.614 "driver_specific": { 00:12:57.614 "passthru": { 00:12:57.614 "name": "pt2", 00:12:57.614 "base_bdev_name": "malloc2" 00:12:57.614 } 00:12:57.614 } 00:12:57.614 }' 00:12:57.614 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.874 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.874 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:57.874 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.874 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.874 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:57.874 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.874 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.874 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:57.874 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:58.134 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:58.134 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:58.134 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:58.134 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:58.134 [2024-07-12 15:49:18.576192] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c606399e-5763-4e16-b5b0-9c24bb1c7698 '!=' c606399e-5763-4e16-b5b0-9c24bb1c7698 ']' 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2508902 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2508902 ']' 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2508902 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2508902 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2508902' 00:12:58.395 killing process with pid 2508902 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2508902 00:12:58.395 [2024-07-12 15:49:18.646260] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:58.395 [2024-07-12 
15:49:18.646303] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:58.395 [2024-07-12 15:49:18.646340] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:58.395 [2024-07-12 15:49:18.646346] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb14da0 name raid_bdev1, state offline 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2508902 00:12:58.395 [2024-07-12 15:49:18.659393] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:58.395 00:12:58.395 real 0m8.986s 00:12:58.395 user 0m16.406s 00:12:58.395 sys 0m1.321s 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:58.395 15:49:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.395 ************************************ 00:12:58.395 END TEST raid_superblock_test 00:12:58.395 ************************************ 00:12:58.657 15:49:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:58.657 15:49:18 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:58.657 15:49:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:58.657 15:49:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:58.657 15:49:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:58.657 ************************************ 00:12:58.657 START TEST raid_read_error_test 00:12:58.657 ************************************ 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:58.657 15:49:18 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ZDNNMxYhPv 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2510659 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2510659 /var/tmp/spdk-raid.sock 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2510659 ']' 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:58.657 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:58.657 15:49:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.657 [2024-07-12 15:49:18.952860] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:12:58.657 [2024-07-12 15:49:18.952913] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2510659 ] 00:12:58.657 [2024-07-12 15:49:19.045640] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:58.918 [2024-07-12 15:49:19.121642] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:58.918 [2024-07-12 15:49:19.164832] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:58.918 [2024-07-12 15:49:19.164856] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.488 15:49:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:59.488 15:49:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:59.488 15:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:59.488 15:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:59.748 BaseBdev1_malloc 00:12:59.748 15:49:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:59.748 true 00:12:59.748 15:49:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:00.009 [2024-07-12 15:49:20.364154] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:00.009 [2024-07-12 15:49:20.364188] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:00.009 [2024-07-12 15:49:20.364198] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f0aa0 00:13:00.009 [2024-07-12 15:49:20.364205] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:00.009 [2024-07-12 15:49:20.365440] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:00.009 [2024-07-12 15:49:20.365459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:00.009 BaseBdev1 00:13:00.009 15:49:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:00.009 15:49:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:00.270 BaseBdev2_malloc 00:13:00.270 15:49:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:00.531 true 00:13:00.531 15:49:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:00.531 [2024-07-12 15:49:20.923037] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:00.531 [2024-07-12 15:49:20.923064] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:00.531 [2024-07-12 15:49:20.923075] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f5e40 00:13:00.531 [2024-07-12 15:49:20.923081] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:00.531 [2024-07-12 15:49:20.924219] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:00.531 [2024-07-12 15:49:20.924236] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:00.531 BaseBdev2 00:13:00.531 15:49:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:00.791 [2024-07-12 15:49:21.103515] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:00.791 [2024-07-12 15:49:21.104489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:00.791 [2024-07-12 15:49:21.104625] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12f7000 00:13:00.791 [2024-07-12 15:49:21.104634] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:00.791 [2024-07-12 15:49:21.104774] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12f7ce0 00:13:00.791 [2024-07-12 15:49:21.104887] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12f7000 00:13:00.791 [2024-07-12 15:49:21.104892] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12f7000 00:13:00.792 [2024-07-12 15:49:21.104969] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.792 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:01.051 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.051 "name": "raid_bdev1", 00:13:01.051 "uuid": "54b79a4d-6bd1-4c13-b46e-74a217ed7a77", 00:13:01.051 "strip_size_kb": 64, 00:13:01.051 "state": "online", 00:13:01.051 "raid_level": 
"concat", 00:13:01.051 "superblock": true, 00:13:01.051 "num_base_bdevs": 2, 00:13:01.051 "num_base_bdevs_discovered": 2, 00:13:01.051 "num_base_bdevs_operational": 2, 00:13:01.051 "base_bdevs_list": [ 00:13:01.051 { 00:13:01.051 "name": "BaseBdev1", 00:13:01.051 "uuid": "271cb056-0c71-5e85-b571-5983aee56cd8", 00:13:01.051 "is_configured": true, 00:13:01.051 "data_offset": 2048, 00:13:01.051 "data_size": 63488 00:13:01.051 }, 00:13:01.051 { 00:13:01.051 "name": "BaseBdev2", 00:13:01.051 "uuid": "2f457eb7-cf8a-54ee-a0d5-6f20934931b8", 00:13:01.051 "is_configured": true, 00:13:01.051 "data_offset": 2048, 00:13:01.051 "data_size": 63488 00:13:01.051 } 00:13:01.051 ] 00:13:01.051 }' 00:13:01.051 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.051 15:49:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.619 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:01.619 15:49:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:01.619 [2024-07-12 15:49:21.961897] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12f24d0 00:13:02.558 15:49:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.818 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:03.077 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.077 "name": "raid_bdev1", 00:13:03.077 "uuid": "54b79a4d-6bd1-4c13-b46e-74a217ed7a77", 00:13:03.077 "strip_size_kb": 64, 00:13:03.077 "state": "online", 
00:13:03.077 "raid_level": "concat", 00:13:03.077 "superblock": true, 00:13:03.077 "num_base_bdevs": 2, 00:13:03.077 "num_base_bdevs_discovered": 2, 00:13:03.077 "num_base_bdevs_operational": 2, 00:13:03.077 "base_bdevs_list": [ 00:13:03.077 { 00:13:03.077 "name": "BaseBdev1", 00:13:03.077 "uuid": "271cb056-0c71-5e85-b571-5983aee56cd8", 00:13:03.077 "is_configured": true, 00:13:03.077 "data_offset": 2048, 00:13:03.077 "data_size": 63488 00:13:03.077 }, 00:13:03.077 { 00:13:03.077 "name": "BaseBdev2", 00:13:03.077 "uuid": "2f457eb7-cf8a-54ee-a0d5-6f20934931b8", 00:13:03.077 "is_configured": true, 00:13:03.077 "data_offset": 2048, 00:13:03.077 "data_size": 63488 00:13:03.077 } 00:13:03.077 ] 00:13:03.077 }' 00:13:03.077 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.077 15:49:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:03.647 15:49:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:03.647 [2024-07-12 15:49:24.007654] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:03.647 [2024-07-12 15:49:24.007688] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:03.647 [2024-07-12 15:49:24.010275] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:03.647 [2024-07-12 15:49:24.010297] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:03.647 [2024-07-12 15:49:24.010317] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:03.647 [2024-07-12 15:49:24.010323] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12f7000 name raid_bdev1, state offline 00:13:03.647 0 00:13:03.647 15:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2510659 00:13:03.647 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2510659 ']' 00:13:03.647 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2510659 00:13:03.647 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:03.647 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:03.648 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2510659 00:13:03.648 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:03.648 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:03.648 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2510659' 00:13:03.648 killing process with pid 2510659 00:13:03.648 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2510659 00:13:03.648 [2024-07-12 15:49:24.076825] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:03.648 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2510659 00:13:03.648 [2024-07-12 15:49:24.082279] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:03.908 15:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ZDNNMxYhPv 00:13:03.908 15:49:24 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:03.908 15:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:03.908 15:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:13:03.908 15:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:03.908 15:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:03.908 15:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:03.908 15:49:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:13:03.908 00:13:03.908 real 0m5.332s 00:13:03.908 user 0m8.393s 00:13:03.908 sys 0m0.779s 00:13:03.908 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:03.908 15:49:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:03.908 ************************************ 00:13:03.908 END TEST raid_read_error_test 00:13:03.908 ************************************ 00:13:03.908 15:49:24 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:03.908 15:49:24 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:13:03.908 15:49:24 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:03.908 15:49:24 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:03.908 15:49:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:03.908 ************************************ 00:13:03.908 START TEST raid_write_error_test 00:13:03.908 ************************************ 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:03.908 15:49:24 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.v7Xq68MyRz 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2511668 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2511668 /var/tmp/spdk-raid.sock 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2511668 ']' 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:03.908 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:03.908 15:49:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:04.168 [2024-07-12 15:49:24.360434] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:13:04.169 [2024-07-12 15:49:24.360479] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2511668 ] 00:13:04.169 [2024-07-12 15:49:24.448327] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:04.169 [2024-07-12 15:49:24.510387] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:04.169 [2024-07-12 15:49:24.558206] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:04.169 [2024-07-12 15:49:24.558232] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:05.108 15:49:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:05.108 15:49:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:05.108 15:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:05.108 15:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:05.368 BaseBdev1_malloc 00:13:05.368 15:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:05.627 true 00:13:05.627 15:49:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:06.196 [2024-07-12 15:49:26.443027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:06.196 [2024-07-12 15:49:26.443057] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:06.197 [2024-07-12 15:49:26.443069] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20d6aa0 00:13:06.197 [2024-07-12 15:49:26.443076] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:06.197 [2024-07-12 15:49:26.444335] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:06.197 [2024-07-12 15:49:26.444355] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:06.197 BaseBdev1 00:13:06.197 15:49:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:06.197 15:49:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:06.767 BaseBdev2_malloc 00:13:06.767 15:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:07.336 true 00:13:07.336 15:49:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:07.905 [2024-07-12 15:49:28.068883] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:07.905 [2024-07-12 15:49:28.068911] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:07.905 [2024-07-12 15:49:28.068923] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20dbe40 00:13:07.905 [2024-07-12 15:49:28.068930] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:07.905 [2024-07-12 15:49:28.070140] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:07.905 [2024-07-12 15:49:28.070159] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:07.905 BaseBdev2 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:07.906 [2024-07-12 15:49:28.277433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:07.906 [2024-07-12 15:49:28.278452] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:07.906 [2024-07-12 15:49:28.278592] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20dd000 00:13:07.906 [2024-07-12 15:49:28.278600] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:07.906 [2024-07-12 15:49:28.278754] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20ddce0 00:13:07.906 [2024-07-12 15:49:28.278868] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20dd000 00:13:07.906 [2024-07-12 15:49:28.278873] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20dd000 00:13:07.906 [2024-07-12 15:49:28.278949] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:07.906 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:08.475 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.475 "name": "raid_bdev1", 00:13:08.475 "uuid": "95270116-aa29-4d20-a81d-caccc66f17c9", 00:13:08.475 "strip_size_kb": 64, 00:13:08.475 "state": "online", 00:13:08.475 
"raid_level": "concat", 00:13:08.475 "superblock": true, 00:13:08.475 "num_base_bdevs": 2, 00:13:08.475 "num_base_bdevs_discovered": 2, 00:13:08.475 "num_base_bdevs_operational": 2, 00:13:08.475 "base_bdevs_list": [ 00:13:08.475 { 00:13:08.475 "name": "BaseBdev1", 00:13:08.475 "uuid": "8b1a0147-adc0-58f5-a77c-20708f339066", 00:13:08.475 "is_configured": true, 00:13:08.475 "data_offset": 2048, 00:13:08.475 "data_size": 63488 00:13:08.475 }, 00:13:08.475 { 00:13:08.475 "name": "BaseBdev2", 00:13:08.475 "uuid": "381ab481-620b-5afe-a96f-04a673b84e9c", 00:13:08.475 "is_configured": true, 00:13:08.475 "data_offset": 2048, 00:13:08.475 "data_size": 63488 00:13:08.475 } 00:13:08.475 ] 00:13:08.475 }' 00:13:08.475 15:49:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.475 15:49:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:09.043 15:49:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:09.043 15:49:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:09.043 [2024-07-12 15:49:29.452613] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20d84d0 00:13:09.981 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.240 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:10.499 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.499 "name": "raid_bdev1", 00:13:10.499 "uuid": "95270116-aa29-4d20-a81d-caccc66f17c9", 00:13:10.499 "strip_size_kb": 
64, 00:13:10.499 "state": "online", 00:13:10.499 "raid_level": "concat", 00:13:10.499 "superblock": true, 00:13:10.499 "num_base_bdevs": 2, 00:13:10.499 "num_base_bdevs_discovered": 2, 00:13:10.499 "num_base_bdevs_operational": 2, 00:13:10.499 "base_bdevs_list": [ 00:13:10.499 { 00:13:10.499 "name": "BaseBdev1", 00:13:10.499 "uuid": "8b1a0147-adc0-58f5-a77c-20708f339066", 00:13:10.499 "is_configured": true, 00:13:10.499 "data_offset": 2048, 00:13:10.499 "data_size": 63488 00:13:10.499 }, 00:13:10.499 { 00:13:10.499 "name": "BaseBdev2", 00:13:10.499 "uuid": "381ab481-620b-5afe-a96f-04a673b84e9c", 00:13:10.499 "is_configured": true, 00:13:10.499 "data_offset": 2048, 00:13:10.499 "data_size": 63488 00:13:10.499 } 00:13:10.499 ] 00:13:10.499 }' 00:13:10.499 15:49:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.499 15:49:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.067 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:11.067 [2024-07-12 15:49:31.451516] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:11.067 [2024-07-12 15:49:31.451550] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:11.067 [2024-07-12 15:49:31.454134] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:11.067 [2024-07-12 15:49:31.454156] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:11.067 [2024-07-12 15:49:31.454176] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:11.067 [2024-07-12 15:49:31.454182] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20dd000 name raid_bdev1, state offline 00:13:11.067 0 00:13:11.067 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2511668 00:13:11.067 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2511668 ']' 00:13:11.067 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2511668 00:13:11.067 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:11.067 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:11.067 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2511668 00:13:11.326 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2511668' 00:13:11.327 killing process with pid 2511668 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2511668 00:13:11.327 [2024-07-12 15:49:31.541346] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2511668 00:13:11.327 [2024-07-12 15:49:31.546819] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job 
/raidtest/tmp.v7Xq68MyRz 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.50 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.50 != \0\.\0\0 ]] 00:13:11.327 00:13:11.327 real 0m7.385s 00:13:11.327 user 0m12.345s 00:13:11.327 sys 0m0.896s 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:11.327 15:49:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.327 ************************************ 00:13:11.327 END TEST raid_write_error_test 00:13:11.327 ************************************ 00:13:11.327 15:49:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:11.327 15:49:31 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:11.327 15:49:31 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:13:11.327 15:49:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:11.327 15:49:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:11.327 15:49:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:11.327 ************************************ 00:13:11.327 START TEST raid_state_function_test 00:13:11.327 ************************************ 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:11.327 15:49:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2513000 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2513000' 00:13:11.327 Process raid pid: 2513000 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2513000 /var/tmp/spdk-raid.sock 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2513000 ']' 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:11.327 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:11.327 15:49:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.586 [2024-07-12 15:49:31.830425] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
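For orientation, raid_state_function_test drives everything below purely over the RPC socket that bdev_svc opens (spdk-raid.sock, launched with -L bdev_raid so the DEBUG traces show up). A minimal sketch of the core calls the trace then cycles through, using only commands that occur verbatim later in this log (the test also deletes and re-creates Existed_Raid between steps; this summary is an editorial gloss, not extra test output):

  # declare a raid1 array over base bdevs that do not exist yet; state stays "configuring"
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
  # back the first slot with a 32 MiB, 512-byte-block malloc bdev (65536 blocks, matching the dumps below); the raid claims it
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
  # inspect the array; "state" flips to "online" only once BaseBdev2 has been created the same way
  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all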
00:13:11.586 [2024-07-12 15:49:31.830494] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:11.586 [2024-07-12 15:49:31.926885] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:11.586 [2024-07-12 15:49:32.005991] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:11.846 [2024-07-12 15:49:32.047973] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:11.846 [2024-07-12 15:49:32.047997] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:12.416 15:49:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:12.417 [2024-07-12 15:49:32.844036] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:12.417 [2024-07-12 15:49:32.844066] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:12.417 [2024-07-12 15:49:32.844072] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:12.417 [2024-07-12 15:49:32.844077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.417 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:12.684 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.684 "name": "Existed_Raid", 00:13:12.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.684 "strip_size_kb": 0, 00:13:12.684 "state": "configuring", 00:13:12.684 "raid_level": "raid1", 00:13:12.684 "superblock": false, 00:13:12.684 
"num_base_bdevs": 2, 00:13:12.684 "num_base_bdevs_discovered": 0, 00:13:12.684 "num_base_bdevs_operational": 2, 00:13:12.684 "base_bdevs_list": [ 00:13:12.684 { 00:13:12.684 "name": "BaseBdev1", 00:13:12.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.684 "is_configured": false, 00:13:12.684 "data_offset": 0, 00:13:12.684 "data_size": 0 00:13:12.684 }, 00:13:12.684 { 00:13:12.684 "name": "BaseBdev2", 00:13:12.684 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:12.684 "is_configured": false, 00:13:12.684 "data_offset": 0, 00:13:12.684 "data_size": 0 00:13:12.684 } 00:13:12.684 ] 00:13:12.684 }' 00:13:12.684 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.684 15:49:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:13.257 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:13.517 [2024-07-12 15:49:33.746223] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:13.517 [2024-07-12 15:49:33.746240] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefa900 name Existed_Raid, state configuring 00:13:13.517 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:13.517 [2024-07-12 15:49:33.906639] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:13.518 [2024-07-12 15:49:33.906656] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:13.518 [2024-07-12 15:49:33.906661] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:13.518 [2024-07-12 15:49:33.906667] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:13.518 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:13.778 [2024-07-12 15:49:34.105677] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:13.778 BaseBdev1 00:13:13.778 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:13.778 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:13.778 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:13.778 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:13.778 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:13.778 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:13.778 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:14.038 [ 00:13:14.038 
{ 00:13:14.038 "name": "BaseBdev1", 00:13:14.038 "aliases": [ 00:13:14.038 "b342f206-9135-4337-ab44-0493eeb7bff5" 00:13:14.038 ], 00:13:14.038 "product_name": "Malloc disk", 00:13:14.038 "block_size": 512, 00:13:14.038 "num_blocks": 65536, 00:13:14.038 "uuid": "b342f206-9135-4337-ab44-0493eeb7bff5", 00:13:14.038 "assigned_rate_limits": { 00:13:14.038 "rw_ios_per_sec": 0, 00:13:14.038 "rw_mbytes_per_sec": 0, 00:13:14.038 "r_mbytes_per_sec": 0, 00:13:14.038 "w_mbytes_per_sec": 0 00:13:14.038 }, 00:13:14.038 "claimed": true, 00:13:14.038 "claim_type": "exclusive_write", 00:13:14.038 "zoned": false, 00:13:14.038 "supported_io_types": { 00:13:14.038 "read": true, 00:13:14.038 "write": true, 00:13:14.038 "unmap": true, 00:13:14.038 "flush": true, 00:13:14.038 "reset": true, 00:13:14.038 "nvme_admin": false, 00:13:14.038 "nvme_io": false, 00:13:14.038 "nvme_io_md": false, 00:13:14.038 "write_zeroes": true, 00:13:14.038 "zcopy": true, 00:13:14.038 "get_zone_info": false, 00:13:14.038 "zone_management": false, 00:13:14.038 "zone_append": false, 00:13:14.038 "compare": false, 00:13:14.038 "compare_and_write": false, 00:13:14.038 "abort": true, 00:13:14.038 "seek_hole": false, 00:13:14.038 "seek_data": false, 00:13:14.038 "copy": true, 00:13:14.038 "nvme_iov_md": false 00:13:14.038 }, 00:13:14.038 "memory_domains": [ 00:13:14.038 { 00:13:14.038 "dma_device_id": "system", 00:13:14.038 "dma_device_type": 1 00:13:14.038 }, 00:13:14.038 { 00:13:14.038 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:14.038 "dma_device_type": 2 00:13:14.038 } 00:13:14.038 ], 00:13:14.038 "driver_specific": {} 00:13:14.038 } 00:13:14.038 ] 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.038 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:14.299 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:14.299 "name": "Existed_Raid", 00:13:14.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:14.299 "strip_size_kb": 0, 00:13:14.299 "state": "configuring", 00:13:14.299 "raid_level": "raid1", 00:13:14.299 
"superblock": false, 00:13:14.299 "num_base_bdevs": 2, 00:13:14.299 "num_base_bdevs_discovered": 1, 00:13:14.299 "num_base_bdevs_operational": 2, 00:13:14.299 "base_bdevs_list": [ 00:13:14.299 { 00:13:14.299 "name": "BaseBdev1", 00:13:14.299 "uuid": "b342f206-9135-4337-ab44-0493eeb7bff5", 00:13:14.299 "is_configured": true, 00:13:14.299 "data_offset": 0, 00:13:14.299 "data_size": 65536 00:13:14.299 }, 00:13:14.299 { 00:13:14.299 "name": "BaseBdev2", 00:13:14.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:14.299 "is_configured": false, 00:13:14.299 "data_offset": 0, 00:13:14.299 "data_size": 0 00:13:14.299 } 00:13:14.299 ] 00:13:14.299 }' 00:13:14.299 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:14.299 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.954 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:14.954 [2024-07-12 15:49:35.328766] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:14.954 [2024-07-12 15:49:35.328798] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefa1d0 name Existed_Raid, state configuring 00:13:14.954 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:15.214 [2024-07-12 15:49:35.489188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:15.214 [2024-07-12 15:49:35.490327] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:15.214 [2024-07-12 15:49:35.490351] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:15.214 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:15.214 
15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:15.474 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:15.474 "name": "Existed_Raid", 00:13:15.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:15.474 "strip_size_kb": 0, 00:13:15.474 "state": "configuring", 00:13:15.474 "raid_level": "raid1", 00:13:15.474 "superblock": false, 00:13:15.474 "num_base_bdevs": 2, 00:13:15.474 "num_base_bdevs_discovered": 1, 00:13:15.474 "num_base_bdevs_operational": 2, 00:13:15.474 "base_bdevs_list": [ 00:13:15.474 { 00:13:15.474 "name": "BaseBdev1", 00:13:15.474 "uuid": "b342f206-9135-4337-ab44-0493eeb7bff5", 00:13:15.474 "is_configured": true, 00:13:15.474 "data_offset": 0, 00:13:15.474 "data_size": 65536 00:13:15.474 }, 00:13:15.474 { 00:13:15.474 "name": "BaseBdev2", 00:13:15.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:15.474 "is_configured": false, 00:13:15.474 "data_offset": 0, 00:13:15.474 "data_size": 0 00:13:15.474 } 00:13:15.474 ] 00:13:15.474 }' 00:13:15.474 15:49:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:15.474 15:49:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:16.045 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:16.045 [2024-07-12 15:49:36.384386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:16.045 [2024-07-12 15:49:36.384413] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xefae80 00:13:16.045 [2024-07-12 15:49:36.384417] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:16.045 [2024-07-12 15:49:36.384563] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbf9290 00:13:16.045 [2024-07-12 15:49:36.384654] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xefae80 00:13:16.045 [2024-07-12 15:49:36.384660] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xefae80 00:13:16.045 [2024-07-12 15:49:36.384792] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:16.045 BaseBdev2 00:13:16.045 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:16.045 15:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:16.045 15:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:16.045 15:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:16.045 15:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:16.045 15:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:16.045 15:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:16.305 15:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:16.566 [ 00:13:16.566 { 
00:13:16.566 "name": "BaseBdev2", 00:13:16.566 "aliases": [ 00:13:16.566 "6e6b1b13-ca12-475a-939d-95c6cc72ef96" 00:13:16.566 ], 00:13:16.566 "product_name": "Malloc disk", 00:13:16.566 "block_size": 512, 00:13:16.566 "num_blocks": 65536, 00:13:16.566 "uuid": "6e6b1b13-ca12-475a-939d-95c6cc72ef96", 00:13:16.566 "assigned_rate_limits": { 00:13:16.566 "rw_ios_per_sec": 0, 00:13:16.566 "rw_mbytes_per_sec": 0, 00:13:16.566 "r_mbytes_per_sec": 0, 00:13:16.566 "w_mbytes_per_sec": 0 00:13:16.566 }, 00:13:16.566 "claimed": true, 00:13:16.566 "claim_type": "exclusive_write", 00:13:16.566 "zoned": false, 00:13:16.566 "supported_io_types": { 00:13:16.566 "read": true, 00:13:16.566 "write": true, 00:13:16.566 "unmap": true, 00:13:16.566 "flush": true, 00:13:16.566 "reset": true, 00:13:16.566 "nvme_admin": false, 00:13:16.566 "nvme_io": false, 00:13:16.566 "nvme_io_md": false, 00:13:16.566 "write_zeroes": true, 00:13:16.566 "zcopy": true, 00:13:16.566 "get_zone_info": false, 00:13:16.566 "zone_management": false, 00:13:16.566 "zone_append": false, 00:13:16.566 "compare": false, 00:13:16.566 "compare_and_write": false, 00:13:16.566 "abort": true, 00:13:16.566 "seek_hole": false, 00:13:16.566 "seek_data": false, 00:13:16.566 "copy": true, 00:13:16.566 "nvme_iov_md": false 00:13:16.566 }, 00:13:16.566 "memory_domains": [ 00:13:16.566 { 00:13:16.566 "dma_device_id": "system", 00:13:16.566 "dma_device_type": 1 00:13:16.566 }, 00:13:16.566 { 00:13:16.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.566 "dma_device_type": 2 00:13:16.566 } 00:13:16.566 ], 00:13:16.566 "driver_specific": {} 00:13:16.566 } 00:13:16.566 ] 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.566 "name": 
"Existed_Raid", 00:13:16.566 "uuid": "ecb4ef95-369b-470c-80c8-60c0eed00e9b", 00:13:16.566 "strip_size_kb": 0, 00:13:16.566 "state": "online", 00:13:16.566 "raid_level": "raid1", 00:13:16.566 "superblock": false, 00:13:16.566 "num_base_bdevs": 2, 00:13:16.566 "num_base_bdevs_discovered": 2, 00:13:16.566 "num_base_bdevs_operational": 2, 00:13:16.566 "base_bdevs_list": [ 00:13:16.566 { 00:13:16.566 "name": "BaseBdev1", 00:13:16.566 "uuid": "b342f206-9135-4337-ab44-0493eeb7bff5", 00:13:16.566 "is_configured": true, 00:13:16.566 "data_offset": 0, 00:13:16.566 "data_size": 65536 00:13:16.566 }, 00:13:16.566 { 00:13:16.566 "name": "BaseBdev2", 00:13:16.566 "uuid": "6e6b1b13-ca12-475a-939d-95c6cc72ef96", 00:13:16.566 "is_configured": true, 00:13:16.566 "data_offset": 0, 00:13:16.566 "data_size": 65536 00:13:16.566 } 00:13:16.566 ] 00:13:16.566 }' 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.566 15:49:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:17.136 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:17.136 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:17.136 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:17.136 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:17.136 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:17.136 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:17.136 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:17.136 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:17.397 [2024-07-12 15:49:37.683894] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:17.398 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:17.398 "name": "Existed_Raid", 00:13:17.398 "aliases": [ 00:13:17.398 "ecb4ef95-369b-470c-80c8-60c0eed00e9b" 00:13:17.398 ], 00:13:17.398 "product_name": "Raid Volume", 00:13:17.398 "block_size": 512, 00:13:17.398 "num_blocks": 65536, 00:13:17.398 "uuid": "ecb4ef95-369b-470c-80c8-60c0eed00e9b", 00:13:17.398 "assigned_rate_limits": { 00:13:17.398 "rw_ios_per_sec": 0, 00:13:17.398 "rw_mbytes_per_sec": 0, 00:13:17.398 "r_mbytes_per_sec": 0, 00:13:17.398 "w_mbytes_per_sec": 0 00:13:17.398 }, 00:13:17.398 "claimed": false, 00:13:17.398 "zoned": false, 00:13:17.398 "supported_io_types": { 00:13:17.398 "read": true, 00:13:17.398 "write": true, 00:13:17.398 "unmap": false, 00:13:17.398 "flush": false, 00:13:17.398 "reset": true, 00:13:17.398 "nvme_admin": false, 00:13:17.398 "nvme_io": false, 00:13:17.398 "nvme_io_md": false, 00:13:17.398 "write_zeroes": true, 00:13:17.398 "zcopy": false, 00:13:17.398 "get_zone_info": false, 00:13:17.398 "zone_management": false, 00:13:17.398 "zone_append": false, 00:13:17.398 "compare": false, 00:13:17.398 "compare_and_write": false, 00:13:17.398 "abort": false, 00:13:17.398 "seek_hole": false, 00:13:17.398 "seek_data": false, 00:13:17.398 "copy": false, 00:13:17.398 "nvme_iov_md": false 00:13:17.398 }, 00:13:17.398 "memory_domains": [ 
00:13:17.398 { 00:13:17.398 "dma_device_id": "system", 00:13:17.398 "dma_device_type": 1 00:13:17.398 }, 00:13:17.398 { 00:13:17.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.398 "dma_device_type": 2 00:13:17.398 }, 00:13:17.398 { 00:13:17.398 "dma_device_id": "system", 00:13:17.398 "dma_device_type": 1 00:13:17.398 }, 00:13:17.398 { 00:13:17.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.398 "dma_device_type": 2 00:13:17.398 } 00:13:17.398 ], 00:13:17.398 "driver_specific": { 00:13:17.398 "raid": { 00:13:17.398 "uuid": "ecb4ef95-369b-470c-80c8-60c0eed00e9b", 00:13:17.398 "strip_size_kb": 0, 00:13:17.398 "state": "online", 00:13:17.398 "raid_level": "raid1", 00:13:17.398 "superblock": false, 00:13:17.398 "num_base_bdevs": 2, 00:13:17.398 "num_base_bdevs_discovered": 2, 00:13:17.398 "num_base_bdevs_operational": 2, 00:13:17.398 "base_bdevs_list": [ 00:13:17.398 { 00:13:17.398 "name": "BaseBdev1", 00:13:17.398 "uuid": "b342f206-9135-4337-ab44-0493eeb7bff5", 00:13:17.398 "is_configured": true, 00:13:17.398 "data_offset": 0, 00:13:17.398 "data_size": 65536 00:13:17.398 }, 00:13:17.398 { 00:13:17.398 "name": "BaseBdev2", 00:13:17.398 "uuid": "6e6b1b13-ca12-475a-939d-95c6cc72ef96", 00:13:17.398 "is_configured": true, 00:13:17.398 "data_offset": 0, 00:13:17.398 "data_size": 65536 00:13:17.398 } 00:13:17.398 ] 00:13:17.398 } 00:13:17.398 } 00:13:17.398 }' 00:13:17.398 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:17.398 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:17.398 BaseBdev2' 00:13:17.398 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:17.398 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:17.398 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:17.658 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:17.658 "name": "BaseBdev1", 00:13:17.658 "aliases": [ 00:13:17.658 "b342f206-9135-4337-ab44-0493eeb7bff5" 00:13:17.658 ], 00:13:17.658 "product_name": "Malloc disk", 00:13:17.658 "block_size": 512, 00:13:17.658 "num_blocks": 65536, 00:13:17.658 "uuid": "b342f206-9135-4337-ab44-0493eeb7bff5", 00:13:17.658 "assigned_rate_limits": { 00:13:17.658 "rw_ios_per_sec": 0, 00:13:17.658 "rw_mbytes_per_sec": 0, 00:13:17.658 "r_mbytes_per_sec": 0, 00:13:17.658 "w_mbytes_per_sec": 0 00:13:17.658 }, 00:13:17.658 "claimed": true, 00:13:17.658 "claim_type": "exclusive_write", 00:13:17.658 "zoned": false, 00:13:17.658 "supported_io_types": { 00:13:17.658 "read": true, 00:13:17.658 "write": true, 00:13:17.658 "unmap": true, 00:13:17.658 "flush": true, 00:13:17.658 "reset": true, 00:13:17.658 "nvme_admin": false, 00:13:17.658 "nvme_io": false, 00:13:17.658 "nvme_io_md": false, 00:13:17.658 "write_zeroes": true, 00:13:17.658 "zcopy": true, 00:13:17.658 "get_zone_info": false, 00:13:17.658 "zone_management": false, 00:13:17.658 "zone_append": false, 00:13:17.658 "compare": false, 00:13:17.658 "compare_and_write": false, 00:13:17.658 "abort": true, 00:13:17.658 "seek_hole": false, 00:13:17.658 "seek_data": false, 00:13:17.658 "copy": true, 00:13:17.658 "nvme_iov_md": false 00:13:17.658 }, 00:13:17.658 "memory_domains": [ 
00:13:17.658 { 00:13:17.658 "dma_device_id": "system", 00:13:17.658 "dma_device_type": 1 00:13:17.658 }, 00:13:17.658 { 00:13:17.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.658 "dma_device_type": 2 00:13:17.658 } 00:13:17.658 ], 00:13:17.658 "driver_specific": {} 00:13:17.658 }' 00:13:17.658 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.658 15:49:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:17.659 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:17.659 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.659 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:17.919 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.180 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.180 "name": "BaseBdev2", 00:13:18.180 "aliases": [ 00:13:18.180 "6e6b1b13-ca12-475a-939d-95c6cc72ef96" 00:13:18.180 ], 00:13:18.180 "product_name": "Malloc disk", 00:13:18.180 "block_size": 512, 00:13:18.180 "num_blocks": 65536, 00:13:18.180 "uuid": "6e6b1b13-ca12-475a-939d-95c6cc72ef96", 00:13:18.180 "assigned_rate_limits": { 00:13:18.180 "rw_ios_per_sec": 0, 00:13:18.180 "rw_mbytes_per_sec": 0, 00:13:18.180 "r_mbytes_per_sec": 0, 00:13:18.180 "w_mbytes_per_sec": 0 00:13:18.180 }, 00:13:18.180 "claimed": true, 00:13:18.180 "claim_type": "exclusive_write", 00:13:18.180 "zoned": false, 00:13:18.180 "supported_io_types": { 00:13:18.180 "read": true, 00:13:18.180 "write": true, 00:13:18.180 "unmap": true, 00:13:18.180 "flush": true, 00:13:18.180 "reset": true, 00:13:18.180 "nvme_admin": false, 00:13:18.180 "nvme_io": false, 00:13:18.180 "nvme_io_md": false, 00:13:18.180 "write_zeroes": true, 00:13:18.180 "zcopy": true, 00:13:18.180 "get_zone_info": false, 00:13:18.180 "zone_management": false, 00:13:18.180 "zone_append": false, 00:13:18.180 "compare": false, 00:13:18.180 "compare_and_write": false, 00:13:18.180 "abort": true, 00:13:18.180 "seek_hole": false, 00:13:18.180 "seek_data": false, 00:13:18.180 "copy": true, 00:13:18.180 "nvme_iov_md": false 00:13:18.180 }, 00:13:18.180 "memory_domains": [ 00:13:18.180 { 00:13:18.180 "dma_device_id": "system", 00:13:18.180 "dma_device_type": 1 00:13:18.180 }, 00:13:18.180 { 00:13:18.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:13:18.180 "dma_device_type": 2 00:13:18.180 } 00:13:18.180 ], 00:13:18.180 "driver_specific": {} 00:13:18.180 }' 00:13:18.180 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.180 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.180 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:18.180 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.441 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.441 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.441 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.441 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.441 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.441 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.441 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.441 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:18.441 15:49:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:18.701 [2024-07-12 15:49:39.019079] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.701 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.701 15:49:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:18.961 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:18.961 "name": "Existed_Raid", 00:13:18.961 "uuid": "ecb4ef95-369b-470c-80c8-60c0eed00e9b", 00:13:18.961 "strip_size_kb": 0, 00:13:18.961 "state": "online", 00:13:18.961 "raid_level": "raid1", 00:13:18.961 "superblock": false, 00:13:18.961 "num_base_bdevs": 2, 00:13:18.961 "num_base_bdevs_discovered": 1, 00:13:18.961 "num_base_bdevs_operational": 1, 00:13:18.961 "base_bdevs_list": [ 00:13:18.961 { 00:13:18.961 "name": null, 00:13:18.961 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:18.961 "is_configured": false, 00:13:18.961 "data_offset": 0, 00:13:18.961 "data_size": 65536 00:13:18.961 }, 00:13:18.961 { 00:13:18.961 "name": "BaseBdev2", 00:13:18.961 "uuid": "6e6b1b13-ca12-475a-939d-95c6cc72ef96", 00:13:18.961 "is_configured": true, 00:13:18.961 "data_offset": 0, 00:13:18.961 "data_size": 65536 00:13:18.961 } 00:13:18.961 ] 00:13:18.961 }' 00:13:18.961 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:18.961 15:49:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.532 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:19.532 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:19.532 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.532 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:19.532 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:19.532 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:19.532 15:49:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:19.792 [2024-07-12 15:49:40.125883] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:19.792 [2024-07-12 15:49:40.125950] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:19.792 [2024-07-12 15:49:40.131838] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:19.792 [2024-07-12 15:49:40.131867] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:19.792 [2024-07-12 15:49:40.131874] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefae80 name Existed_Raid, state offline 00:13:19.792 15:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:19.792 15:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:19.792 15:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.792 15:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:20.053 
15:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2513000 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2513000 ']' 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2513000 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2513000 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2513000' 00:13:20.053 killing process with pid 2513000 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2513000 00:13:20.053 [2024-07-12 15:49:40.372251] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2513000 00:13:20.053 [2024-07-12 15:49:40.372850] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:20.053 00:13:20.053 real 0m8.733s 00:13:20.053 user 0m15.823s 00:13:20.053 sys 0m1.351s 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:20.053 15:49:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:20.053 ************************************ 00:13:20.053 END TEST raid_state_function_test 00:13:20.053 ************************************ 00:13:20.314 15:49:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:20.314 15:49:40 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:13:20.314 15:49:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:20.314 15:49:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:20.314 15:49:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:20.315 ************************************ 00:13:20.315 START TEST raid_state_function_test_sb 00:13:20.315 ************************************ 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( 
i = 1 )) 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2514747 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2514747' 00:13:20.315 Process raid pid: 2514747 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2514747 /var/tmp/spdk-raid.sock 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2514747 ']' 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:20.315 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:20.315 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:20.315 [2024-07-12 15:49:40.629310] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
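The superblock variant that follows repeats the same configuring/online cycle; the only difference visible in the RPC traffic is the -s flag on creation, which is what the "superblock": true field in the state dumps reflects. A one-line sketch, copied from the exact call that appears in the trace below:

  /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid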
00:13:20.315 [2024-07-12 15:49:40.629352] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:20.315 [2024-07-12 15:49:40.714790] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:20.575 [2024-07-12 15:49:40.777231] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:20.575 [2024-07-12 15:49:40.817859] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:20.575 [2024-07-12 15:49:40.817881] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:21.145 15:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:21.145 15:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:21.145 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:21.404 [2024-07-12 15:49:41.633338] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:21.404 [2024-07-12 15:49:41.633366] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:21.404 [2024-07-12 15:49:41.633372] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:21.404 [2024-07-12 15:49:41.633378] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:21.405 "name": "Existed_Raid", 00:13:21.405 "uuid": "7f2007f2-1d5e-44dd-94cd-37de211263ee", 00:13:21.405 "strip_size_kb": 0, 00:13:21.405 "state": "configuring", 00:13:21.405 "raid_level": "raid1", 
00:13:21.405 "superblock": true, 00:13:21.405 "num_base_bdevs": 2, 00:13:21.405 "num_base_bdevs_discovered": 0, 00:13:21.405 "num_base_bdevs_operational": 2, 00:13:21.405 "base_bdevs_list": [ 00:13:21.405 { 00:13:21.405 "name": "BaseBdev1", 00:13:21.405 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:21.405 "is_configured": false, 00:13:21.405 "data_offset": 0, 00:13:21.405 "data_size": 0 00:13:21.405 }, 00:13:21.405 { 00:13:21.405 "name": "BaseBdev2", 00:13:21.405 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:21.405 "is_configured": false, 00:13:21.405 "data_offset": 0, 00:13:21.405 "data_size": 0 00:13:21.405 } 00:13:21.405 ] 00:13:21.405 }' 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:21.405 15:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:21.974 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:22.233 [2024-07-12 15:49:42.543533] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:22.233 [2024-07-12 15:49:42.543548] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f76900 name Existed_Raid, state configuring 00:13:22.233 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:22.494 [2024-07-12 15:49:42.740044] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:22.494 [2024-07-12 15:49:42.740063] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:22.494 [2024-07-12 15:49:42.740068] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:22.494 [2024-07-12 15:49:42.740073] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:22.494 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:22.494 [2024-07-12 15:49:42.931095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:22.494 BaseBdev1 00:13:22.756 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:22.756 15:49:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:22.756 15:49:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:22.756 15:49:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:22.756 15:49:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:22.756 15:49:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:22.756 15:49:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:22.756 15:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:23.016 [ 00:13:23.016 { 00:13:23.016 "name": "BaseBdev1", 00:13:23.016 "aliases": [ 00:13:23.016 "0741b30b-3678-441c-b07a-566bfb69ab33" 00:13:23.016 ], 00:13:23.016 "product_name": "Malloc disk", 00:13:23.016 "block_size": 512, 00:13:23.016 "num_blocks": 65536, 00:13:23.016 "uuid": "0741b30b-3678-441c-b07a-566bfb69ab33", 00:13:23.016 "assigned_rate_limits": { 00:13:23.016 "rw_ios_per_sec": 0, 00:13:23.016 "rw_mbytes_per_sec": 0, 00:13:23.016 "r_mbytes_per_sec": 0, 00:13:23.016 "w_mbytes_per_sec": 0 00:13:23.016 }, 00:13:23.016 "claimed": true, 00:13:23.017 "claim_type": "exclusive_write", 00:13:23.017 "zoned": false, 00:13:23.017 "supported_io_types": { 00:13:23.017 "read": true, 00:13:23.017 "write": true, 00:13:23.017 "unmap": true, 00:13:23.017 "flush": true, 00:13:23.017 "reset": true, 00:13:23.017 "nvme_admin": false, 00:13:23.017 "nvme_io": false, 00:13:23.017 "nvme_io_md": false, 00:13:23.017 "write_zeroes": true, 00:13:23.017 "zcopy": true, 00:13:23.017 "get_zone_info": false, 00:13:23.017 "zone_management": false, 00:13:23.017 "zone_append": false, 00:13:23.017 "compare": false, 00:13:23.017 "compare_and_write": false, 00:13:23.017 "abort": true, 00:13:23.017 "seek_hole": false, 00:13:23.017 "seek_data": false, 00:13:23.017 "copy": true, 00:13:23.017 "nvme_iov_md": false 00:13:23.017 }, 00:13:23.017 "memory_domains": [ 00:13:23.017 { 00:13:23.017 "dma_device_id": "system", 00:13:23.017 "dma_device_type": 1 00:13:23.017 }, 00:13:23.017 { 00:13:23.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:23.017 "dma_device_type": 2 00:13:23.017 } 00:13:23.017 ], 00:13:23.017 "driver_specific": {} 00:13:23.017 } 00:13:23.017 ] 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.017 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.276 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.276 "name": "Existed_Raid", 00:13:23.276 "uuid": 
"fd9361bb-55c0-4639-b298-424ef294d50a", 00:13:23.276 "strip_size_kb": 0, 00:13:23.276 "state": "configuring", 00:13:23.276 "raid_level": "raid1", 00:13:23.276 "superblock": true, 00:13:23.276 "num_base_bdevs": 2, 00:13:23.276 "num_base_bdevs_discovered": 1, 00:13:23.276 "num_base_bdevs_operational": 2, 00:13:23.276 "base_bdevs_list": [ 00:13:23.276 { 00:13:23.276 "name": "BaseBdev1", 00:13:23.276 "uuid": "0741b30b-3678-441c-b07a-566bfb69ab33", 00:13:23.276 "is_configured": true, 00:13:23.276 "data_offset": 2048, 00:13:23.276 "data_size": 63488 00:13:23.276 }, 00:13:23.276 { 00:13:23.276 "name": "BaseBdev2", 00:13:23.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.276 "is_configured": false, 00:13:23.276 "data_offset": 0, 00:13:23.276 "data_size": 0 00:13:23.276 } 00:13:23.276 ] 00:13:23.276 }' 00:13:23.276 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.276 15:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:23.846 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:23.846 [2024-07-12 15:49:44.250417] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:23.846 [2024-07-12 15:49:44.250443] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f761d0 name Existed_Raid, state configuring 00:13:23.846 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:24.106 [2024-07-12 15:49:44.442934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:24.106 [2024-07-12 15:49:44.444055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:24.106 [2024-07-12 15:49:44.444077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 
00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.106 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.366 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.366 "name": "Existed_Raid", 00:13:24.366 "uuid": "dc21820e-6fc3-4dbc-936c-47a8fde16885", 00:13:24.366 "strip_size_kb": 0, 00:13:24.366 "state": "configuring", 00:13:24.366 "raid_level": "raid1", 00:13:24.366 "superblock": true, 00:13:24.366 "num_base_bdevs": 2, 00:13:24.366 "num_base_bdevs_discovered": 1, 00:13:24.366 "num_base_bdevs_operational": 2, 00:13:24.366 "base_bdevs_list": [ 00:13:24.366 { 00:13:24.366 "name": "BaseBdev1", 00:13:24.366 "uuid": "0741b30b-3678-441c-b07a-566bfb69ab33", 00:13:24.366 "is_configured": true, 00:13:24.366 "data_offset": 2048, 00:13:24.366 "data_size": 63488 00:13:24.366 }, 00:13:24.366 { 00:13:24.366 "name": "BaseBdev2", 00:13:24.366 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.366 "is_configured": false, 00:13:24.366 "data_offset": 0, 00:13:24.366 "data_size": 0 00:13:24.366 } 00:13:24.366 ] 00:13:24.366 }' 00:13:24.366 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.366 15:49:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:24.936 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:24.936 [2024-07-12 15:49:45.382333] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:24.936 [2024-07-12 15:49:45.382443] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f76e80 00:13:24.936 [2024-07-12 15:49:45.382451] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:24.936 [2024-07-12 15:49:45.382585] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c75290 00:13:24.936 [2024-07-12 15:49:45.382676] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f76e80 00:13:24.936 [2024-07-12 15:49:45.382682] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f76e80 00:13:24.936 [2024-07-12 15:49:45.382755] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:24.936 BaseBdev2 00:13:25.197 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:25.197 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:25.197 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:25.197 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:25.197 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:25.197 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:25.197 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 
00:13:25.197 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:25.457 [ 00:13:25.457 { 00:13:25.457 "name": "BaseBdev2", 00:13:25.457 "aliases": [ 00:13:25.457 "900e673b-1bab-4242-bcc2-2de4c574ec7b" 00:13:25.457 ], 00:13:25.457 "product_name": "Malloc disk", 00:13:25.457 "block_size": 512, 00:13:25.457 "num_blocks": 65536, 00:13:25.457 "uuid": "900e673b-1bab-4242-bcc2-2de4c574ec7b", 00:13:25.457 "assigned_rate_limits": { 00:13:25.457 "rw_ios_per_sec": 0, 00:13:25.457 "rw_mbytes_per_sec": 0, 00:13:25.457 "r_mbytes_per_sec": 0, 00:13:25.457 "w_mbytes_per_sec": 0 00:13:25.457 }, 00:13:25.457 "claimed": true, 00:13:25.457 "claim_type": "exclusive_write", 00:13:25.457 "zoned": false, 00:13:25.457 "supported_io_types": { 00:13:25.457 "read": true, 00:13:25.457 "write": true, 00:13:25.457 "unmap": true, 00:13:25.457 "flush": true, 00:13:25.457 "reset": true, 00:13:25.457 "nvme_admin": false, 00:13:25.457 "nvme_io": false, 00:13:25.457 "nvme_io_md": false, 00:13:25.457 "write_zeroes": true, 00:13:25.457 "zcopy": true, 00:13:25.457 "get_zone_info": false, 00:13:25.457 "zone_management": false, 00:13:25.457 "zone_append": false, 00:13:25.457 "compare": false, 00:13:25.457 "compare_and_write": false, 00:13:25.457 "abort": true, 00:13:25.457 "seek_hole": false, 00:13:25.457 "seek_data": false, 00:13:25.457 "copy": true, 00:13:25.457 "nvme_iov_md": false 00:13:25.457 }, 00:13:25.457 "memory_domains": [ 00:13:25.457 { 00:13:25.457 "dma_device_id": "system", 00:13:25.457 "dma_device_type": 1 00:13:25.457 }, 00:13:25.457 { 00:13:25.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.457 "dma_device_type": 2 00:13:25.457 } 00:13:25.457 ], 00:13:25.457 "driver_specific": {} 00:13:25.457 } 00:13:25.457 ] 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.458 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.741 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.741 "name": "Existed_Raid", 00:13:25.741 "uuid": "dc21820e-6fc3-4dbc-936c-47a8fde16885", 00:13:25.741 "strip_size_kb": 0, 00:13:25.741 "state": "online", 00:13:25.741 "raid_level": "raid1", 00:13:25.741 "superblock": true, 00:13:25.741 "num_base_bdevs": 2, 00:13:25.741 "num_base_bdevs_discovered": 2, 00:13:25.741 "num_base_bdevs_operational": 2, 00:13:25.741 "base_bdevs_list": [ 00:13:25.741 { 00:13:25.741 "name": "BaseBdev1", 00:13:25.741 "uuid": "0741b30b-3678-441c-b07a-566bfb69ab33", 00:13:25.741 "is_configured": true, 00:13:25.741 "data_offset": 2048, 00:13:25.741 "data_size": 63488 00:13:25.741 }, 00:13:25.741 { 00:13:25.741 "name": "BaseBdev2", 00:13:25.741 "uuid": "900e673b-1bab-4242-bcc2-2de4c574ec7b", 00:13:25.741 "is_configured": true, 00:13:25.741 "data_offset": 2048, 00:13:25.741 "data_size": 63488 00:13:25.741 } 00:13:25.741 ] 00:13:25.741 }' 00:13:25.741 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.741 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:26.309 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:26.309 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:26.309 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:26.309 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:26.309 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:26.309 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:26.309 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:26.309 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:26.309 [2024-07-12 15:49:46.665786] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:26.309 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:26.309 "name": "Existed_Raid", 00:13:26.309 "aliases": [ 00:13:26.310 "dc21820e-6fc3-4dbc-936c-47a8fde16885" 00:13:26.310 ], 00:13:26.310 "product_name": "Raid Volume", 00:13:26.310 "block_size": 512, 00:13:26.310 "num_blocks": 63488, 00:13:26.310 "uuid": "dc21820e-6fc3-4dbc-936c-47a8fde16885", 00:13:26.310 "assigned_rate_limits": { 00:13:26.310 "rw_ios_per_sec": 0, 00:13:26.310 "rw_mbytes_per_sec": 0, 00:13:26.310 "r_mbytes_per_sec": 0, 00:13:26.310 "w_mbytes_per_sec": 0 00:13:26.310 }, 00:13:26.310 "claimed": false, 00:13:26.310 "zoned": false, 00:13:26.310 "supported_io_types": { 00:13:26.310 "read": true, 00:13:26.310 "write": true, 00:13:26.310 "unmap": false, 00:13:26.310 "flush": false, 00:13:26.310 "reset": true, 00:13:26.310 "nvme_admin": false, 00:13:26.310 "nvme_io": false, 00:13:26.310 "nvme_io_md": false, 00:13:26.310 "write_zeroes": true, 00:13:26.310 "zcopy": false, 00:13:26.310 "get_zone_info": false, 
00:13:26.310 "zone_management": false, 00:13:26.310 "zone_append": false, 00:13:26.310 "compare": false, 00:13:26.310 "compare_and_write": false, 00:13:26.310 "abort": false, 00:13:26.310 "seek_hole": false, 00:13:26.310 "seek_data": false, 00:13:26.310 "copy": false, 00:13:26.310 "nvme_iov_md": false 00:13:26.310 }, 00:13:26.310 "memory_domains": [ 00:13:26.310 { 00:13:26.310 "dma_device_id": "system", 00:13:26.310 "dma_device_type": 1 00:13:26.310 }, 00:13:26.310 { 00:13:26.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.310 "dma_device_type": 2 00:13:26.310 }, 00:13:26.310 { 00:13:26.310 "dma_device_id": "system", 00:13:26.310 "dma_device_type": 1 00:13:26.310 }, 00:13:26.310 { 00:13:26.310 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.310 "dma_device_type": 2 00:13:26.310 } 00:13:26.310 ], 00:13:26.310 "driver_specific": { 00:13:26.310 "raid": { 00:13:26.310 "uuid": "dc21820e-6fc3-4dbc-936c-47a8fde16885", 00:13:26.310 "strip_size_kb": 0, 00:13:26.310 "state": "online", 00:13:26.310 "raid_level": "raid1", 00:13:26.310 "superblock": true, 00:13:26.310 "num_base_bdevs": 2, 00:13:26.310 "num_base_bdevs_discovered": 2, 00:13:26.310 "num_base_bdevs_operational": 2, 00:13:26.310 "base_bdevs_list": [ 00:13:26.310 { 00:13:26.310 "name": "BaseBdev1", 00:13:26.310 "uuid": "0741b30b-3678-441c-b07a-566bfb69ab33", 00:13:26.310 "is_configured": true, 00:13:26.310 "data_offset": 2048, 00:13:26.310 "data_size": 63488 00:13:26.310 }, 00:13:26.310 { 00:13:26.310 "name": "BaseBdev2", 00:13:26.310 "uuid": "900e673b-1bab-4242-bcc2-2de4c574ec7b", 00:13:26.310 "is_configured": true, 00:13:26.310 "data_offset": 2048, 00:13:26.310 "data_size": 63488 00:13:26.310 } 00:13:26.310 ] 00:13:26.310 } 00:13:26.310 } 00:13:26.310 }' 00:13:26.310 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:26.310 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:26.310 BaseBdev2' 00:13:26.310 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:26.310 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:26.310 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:26.570 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:26.570 "name": "BaseBdev1", 00:13:26.570 "aliases": [ 00:13:26.570 "0741b30b-3678-441c-b07a-566bfb69ab33" 00:13:26.570 ], 00:13:26.570 "product_name": "Malloc disk", 00:13:26.570 "block_size": 512, 00:13:26.570 "num_blocks": 65536, 00:13:26.570 "uuid": "0741b30b-3678-441c-b07a-566bfb69ab33", 00:13:26.570 "assigned_rate_limits": { 00:13:26.570 "rw_ios_per_sec": 0, 00:13:26.570 "rw_mbytes_per_sec": 0, 00:13:26.570 "r_mbytes_per_sec": 0, 00:13:26.570 "w_mbytes_per_sec": 0 00:13:26.570 }, 00:13:26.570 "claimed": true, 00:13:26.570 "claim_type": "exclusive_write", 00:13:26.570 "zoned": false, 00:13:26.570 "supported_io_types": { 00:13:26.570 "read": true, 00:13:26.570 "write": true, 00:13:26.570 "unmap": true, 00:13:26.570 "flush": true, 00:13:26.570 "reset": true, 00:13:26.570 "nvme_admin": false, 00:13:26.570 "nvme_io": false, 00:13:26.570 "nvme_io_md": false, 00:13:26.570 "write_zeroes": true, 00:13:26.570 "zcopy": true, 00:13:26.570 
"get_zone_info": false, 00:13:26.570 "zone_management": false, 00:13:26.570 "zone_append": false, 00:13:26.570 "compare": false, 00:13:26.570 "compare_and_write": false, 00:13:26.570 "abort": true, 00:13:26.570 "seek_hole": false, 00:13:26.570 "seek_data": false, 00:13:26.570 "copy": true, 00:13:26.570 "nvme_iov_md": false 00:13:26.570 }, 00:13:26.570 "memory_domains": [ 00:13:26.570 { 00:13:26.570 "dma_device_id": "system", 00:13:26.570 "dma_device_type": 1 00:13:26.570 }, 00:13:26.570 { 00:13:26.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.570 "dma_device_type": 2 00:13:26.570 } 00:13:26.570 ], 00:13:26.570 "driver_specific": {} 00:13:26.570 }' 00:13:26.570 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.570 15:49:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.570 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:26.570 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:26.830 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:27.090 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:27.090 "name": "BaseBdev2", 00:13:27.090 "aliases": [ 00:13:27.090 "900e673b-1bab-4242-bcc2-2de4c574ec7b" 00:13:27.090 ], 00:13:27.090 "product_name": "Malloc disk", 00:13:27.090 "block_size": 512, 00:13:27.090 "num_blocks": 65536, 00:13:27.090 "uuid": "900e673b-1bab-4242-bcc2-2de4c574ec7b", 00:13:27.090 "assigned_rate_limits": { 00:13:27.090 "rw_ios_per_sec": 0, 00:13:27.090 "rw_mbytes_per_sec": 0, 00:13:27.090 "r_mbytes_per_sec": 0, 00:13:27.090 "w_mbytes_per_sec": 0 00:13:27.090 }, 00:13:27.090 "claimed": true, 00:13:27.090 "claim_type": "exclusive_write", 00:13:27.090 "zoned": false, 00:13:27.090 "supported_io_types": { 00:13:27.090 "read": true, 00:13:27.090 "write": true, 00:13:27.090 "unmap": true, 00:13:27.090 "flush": true, 00:13:27.090 "reset": true, 00:13:27.090 "nvme_admin": false, 00:13:27.090 "nvme_io": false, 00:13:27.090 "nvme_io_md": false, 00:13:27.090 "write_zeroes": true, 00:13:27.090 "zcopy": true, 00:13:27.090 "get_zone_info": false, 00:13:27.090 "zone_management": false, 00:13:27.090 "zone_append": false, 00:13:27.090 "compare": false, 
00:13:27.090 "compare_and_write": false, 00:13:27.090 "abort": true, 00:13:27.090 "seek_hole": false, 00:13:27.090 "seek_data": false, 00:13:27.090 "copy": true, 00:13:27.090 "nvme_iov_md": false 00:13:27.090 }, 00:13:27.090 "memory_domains": [ 00:13:27.090 { 00:13:27.090 "dma_device_id": "system", 00:13:27.090 "dma_device_type": 1 00:13:27.090 }, 00:13:27.090 { 00:13:27.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.090 "dma_device_type": 2 00:13:27.090 } 00:13:27.090 ], 00:13:27.090 "driver_specific": {} 00:13:27.090 }' 00:13:27.090 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.090 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.351 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:27.351 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.351 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.352 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:27.352 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.352 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.352 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:27.352 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.352 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.352 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:27.352 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:27.612 [2024-07-12 15:49:47.944839] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:27.612 15:49:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:27.612 15:49:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:27.872 15:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:27.872 "name": "Existed_Raid", 00:13:27.872 "uuid": "dc21820e-6fc3-4dbc-936c-47a8fde16885", 00:13:27.872 "strip_size_kb": 0, 00:13:27.872 "state": "online", 00:13:27.872 "raid_level": "raid1", 00:13:27.872 "superblock": true, 00:13:27.872 "num_base_bdevs": 2, 00:13:27.872 "num_base_bdevs_discovered": 1, 00:13:27.872 "num_base_bdevs_operational": 1, 00:13:27.872 "base_bdevs_list": [ 00:13:27.872 { 00:13:27.872 "name": null, 00:13:27.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:27.872 "is_configured": false, 00:13:27.872 "data_offset": 2048, 00:13:27.872 "data_size": 63488 00:13:27.872 }, 00:13:27.872 { 00:13:27.872 "name": "BaseBdev2", 00:13:27.872 "uuid": "900e673b-1bab-4242-bcc2-2de4c574ec7b", 00:13:27.872 "is_configured": true, 00:13:27.872 "data_offset": 2048, 00:13:27.872 "data_size": 63488 00:13:27.872 } 00:13:27.872 ] 00:13:27.872 }' 00:13:27.872 15:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:27.872 15:49:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:28.442 15:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:28.442 15:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:28.442 15:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.442 15:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:28.703 15:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:28.703 15:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:28.703 15:49:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:28.703 [2024-07-12 15:49:49.071704] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:28.703 [2024-07-12 15:49:49.071768] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:28.703 [2024-07-12 15:49:49.077792] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:28.703 [2024-07-12 15:49:49.077817] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:28.703 [2024-07-12 15:49:49.077824] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f76e80 name Existed_Raid, state offline 00:13:28.703 15:49:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:28.703 15:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:28.703 15:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.703 15:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2514747 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2514747 ']' 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2514747 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2514747 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2514747' 00:13:28.963 killing process with pid 2514747 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2514747 00:13:28.963 [2024-07-12 15:49:49.334129] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:28.963 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2514747 00:13:28.963 [2024-07-12 15:49:49.334737] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:29.224 15:49:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:29.224 00:13:29.224 real 0m8.886s 00:13:29.224 user 0m16.170s 00:13:29.224 sys 0m1.361s 00:13:29.224 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:29.224 15:49:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:29.224 ************************************ 00:13:29.224 END TEST raid_state_function_test_sb 00:13:29.224 ************************************ 00:13:29.224 15:49:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:29.224 15:49:49 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:13:29.224 15:49:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:13:29.224 15:49:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:29.224 15:49:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:29.224 ************************************ 00:13:29.224 START TEST raid_superblock_test 00:13:29.224 ************************************ 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2516507 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2516507 /var/tmp/spdk-raid.sock 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2516507 ']' 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:29.224 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:29.224 15:49:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.224 [2024-07-12 15:49:49.589143] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:13:29.224 [2024-07-12 15:49:49.589196] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2516507 ] 00:13:29.484 [2024-07-12 15:49:49.678574] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:29.484 [2024-07-12 15:49:49.745913] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:29.484 [2024-07-12 15:49:49.790154] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:29.484 [2024-07-12 15:49:49.790178] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:30.054 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:13:30.313 malloc1 00:13:30.313 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:30.573 [2024-07-12 15:49:50.764556] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:30.573 [2024-07-12 15:49:50.764591] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:30.573 [2024-07-12 15:49:50.764603] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b8eb50 00:13:30.573 [2024-07-12 15:49:50.764614] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:30.573 [2024-07-12 15:49:50.765931] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:30.573 [2024-07-12 15:49:50.765950] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:30.573 pt1 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:13:30.573 malloc2 00:13:30.573 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:30.832 [2024-07-12 15:49:51.147535] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:30.832 [2024-07-12 15:49:51.147568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:30.832 [2024-07-12 15:49:51.147579] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b8fdf0 00:13:30.832 [2024-07-12 15:49:51.147585] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:30.832 [2024-07-12 15:49:51.148826] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:30.832 [2024-07-12 15:49:51.148845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:30.832 pt2 00:13:30.832 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:13:30.832 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:13:30.832 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:13:31.091 [2024-07-12 15:49:51.323988] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:31.091 [2024-07-12 15:49:51.324982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:31.091 [2024-07-12 15:49:51.325092] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d350f0 00:13:31.091 [2024-07-12 15:49:51.325100] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:31.091 [2024-07-12 15:49:51.325245] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ba5a40 00:13:31.091 [2024-07-12 15:49:51.325355] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d350f0 00:13:31.091 [2024-07-12 15:49:51.325361] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d350f0 00:13:31.091 [2024-07-12 15:49:51.325430] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:31.091 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:31.091 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:31.091 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:31.091 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:31.091 15:49:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:31.091 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:31.091 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.091 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.092 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.092 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.092 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.092 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:31.092 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.092 "name": "raid_bdev1", 00:13:31.092 "uuid": "a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:31.092 "strip_size_kb": 0, 00:13:31.092 "state": "online", 00:13:31.092 "raid_level": "raid1", 00:13:31.092 "superblock": true, 00:13:31.092 "num_base_bdevs": 2, 00:13:31.092 "num_base_bdevs_discovered": 2, 00:13:31.092 "num_base_bdevs_operational": 2, 00:13:31.092 "base_bdevs_list": [ 00:13:31.092 { 00:13:31.092 "name": "pt1", 00:13:31.092 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:31.092 "is_configured": true, 00:13:31.092 "data_offset": 2048, 00:13:31.092 "data_size": 63488 00:13:31.092 }, 00:13:31.092 { 00:13:31.092 "name": "pt2", 00:13:31.092 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:31.092 "is_configured": true, 00:13:31.092 "data_offset": 2048, 00:13:31.092 "data_size": 63488 00:13:31.092 } 00:13:31.092 ] 00:13:31.092 }' 00:13:31.092 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.092 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:31.660 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:13:31.660 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:31.660 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:31.660 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:31.660 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:31.660 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:31.660 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:31.660 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:31.920 [2024-07-12 15:49:52.226451] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:31.920 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:31.920 "name": "raid_bdev1", 00:13:31.920 "aliases": [ 00:13:31.920 "a34ba41c-94e5-47b3-8786-030c5c9348f7" 00:13:31.920 ], 00:13:31.920 "product_name": "Raid Volume", 00:13:31.920 "block_size": 512, 00:13:31.920 "num_blocks": 63488, 00:13:31.920 "uuid": 
"a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:31.920 "assigned_rate_limits": { 00:13:31.920 "rw_ios_per_sec": 0, 00:13:31.920 "rw_mbytes_per_sec": 0, 00:13:31.920 "r_mbytes_per_sec": 0, 00:13:31.920 "w_mbytes_per_sec": 0 00:13:31.920 }, 00:13:31.920 "claimed": false, 00:13:31.920 "zoned": false, 00:13:31.920 "supported_io_types": { 00:13:31.920 "read": true, 00:13:31.920 "write": true, 00:13:31.920 "unmap": false, 00:13:31.920 "flush": false, 00:13:31.920 "reset": true, 00:13:31.920 "nvme_admin": false, 00:13:31.920 "nvme_io": false, 00:13:31.920 "nvme_io_md": false, 00:13:31.920 "write_zeroes": true, 00:13:31.920 "zcopy": false, 00:13:31.920 "get_zone_info": false, 00:13:31.920 "zone_management": false, 00:13:31.920 "zone_append": false, 00:13:31.920 "compare": false, 00:13:31.920 "compare_and_write": false, 00:13:31.920 "abort": false, 00:13:31.920 "seek_hole": false, 00:13:31.920 "seek_data": false, 00:13:31.920 "copy": false, 00:13:31.920 "nvme_iov_md": false 00:13:31.920 }, 00:13:31.920 "memory_domains": [ 00:13:31.920 { 00:13:31.920 "dma_device_id": "system", 00:13:31.920 "dma_device_type": 1 00:13:31.920 }, 00:13:31.920 { 00:13:31.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.920 "dma_device_type": 2 00:13:31.920 }, 00:13:31.920 { 00:13:31.920 "dma_device_id": "system", 00:13:31.920 "dma_device_type": 1 00:13:31.920 }, 00:13:31.920 { 00:13:31.920 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.920 "dma_device_type": 2 00:13:31.920 } 00:13:31.920 ], 00:13:31.920 "driver_specific": { 00:13:31.920 "raid": { 00:13:31.920 "uuid": "a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:31.920 "strip_size_kb": 0, 00:13:31.920 "state": "online", 00:13:31.920 "raid_level": "raid1", 00:13:31.920 "superblock": true, 00:13:31.920 "num_base_bdevs": 2, 00:13:31.920 "num_base_bdevs_discovered": 2, 00:13:31.920 "num_base_bdevs_operational": 2, 00:13:31.920 "base_bdevs_list": [ 00:13:31.920 { 00:13:31.920 "name": "pt1", 00:13:31.920 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:31.920 "is_configured": true, 00:13:31.920 "data_offset": 2048, 00:13:31.920 "data_size": 63488 00:13:31.920 }, 00:13:31.920 { 00:13:31.920 "name": "pt2", 00:13:31.920 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:31.920 "is_configured": true, 00:13:31.920 "data_offset": 2048, 00:13:31.920 "data_size": 63488 00:13:31.920 } 00:13:31.920 ] 00:13:31.920 } 00:13:31.920 } 00:13:31.920 }' 00:13:31.920 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:31.920 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:31.920 pt2' 00:13:31.920 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:31.920 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:31.920 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.180 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.180 "name": "pt1", 00:13:32.180 "aliases": [ 00:13:32.180 "00000000-0000-0000-0000-000000000001" 00:13:32.180 ], 00:13:32.180 "product_name": "passthru", 00:13:32.180 "block_size": 512, 00:13:32.180 "num_blocks": 65536, 00:13:32.180 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:32.180 "assigned_rate_limits": { 00:13:32.180 
"rw_ios_per_sec": 0, 00:13:32.180 "rw_mbytes_per_sec": 0, 00:13:32.180 "r_mbytes_per_sec": 0, 00:13:32.180 "w_mbytes_per_sec": 0 00:13:32.180 }, 00:13:32.180 "claimed": true, 00:13:32.180 "claim_type": "exclusive_write", 00:13:32.180 "zoned": false, 00:13:32.180 "supported_io_types": { 00:13:32.180 "read": true, 00:13:32.180 "write": true, 00:13:32.180 "unmap": true, 00:13:32.180 "flush": true, 00:13:32.180 "reset": true, 00:13:32.180 "nvme_admin": false, 00:13:32.180 "nvme_io": false, 00:13:32.180 "nvme_io_md": false, 00:13:32.180 "write_zeroes": true, 00:13:32.180 "zcopy": true, 00:13:32.180 "get_zone_info": false, 00:13:32.180 "zone_management": false, 00:13:32.180 "zone_append": false, 00:13:32.180 "compare": false, 00:13:32.180 "compare_and_write": false, 00:13:32.180 "abort": true, 00:13:32.180 "seek_hole": false, 00:13:32.180 "seek_data": false, 00:13:32.180 "copy": true, 00:13:32.180 "nvme_iov_md": false 00:13:32.180 }, 00:13:32.180 "memory_domains": [ 00:13:32.180 { 00:13:32.180 "dma_device_id": "system", 00:13:32.180 "dma_device_type": 1 00:13:32.180 }, 00:13:32.180 { 00:13:32.180 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.180 "dma_device_type": 2 00:13:32.180 } 00:13:32.180 ], 00:13:32.180 "driver_specific": { 00:13:32.180 "passthru": { 00:13:32.180 "name": "pt1", 00:13:32.180 "base_bdev_name": "malloc1" 00:13:32.180 } 00:13:32.180 } 00:13:32.180 }' 00:13:32.180 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.180 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.180 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:32.180 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.180 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.439 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:32.439 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.439 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.439 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.439 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.439 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.439 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:32.439 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:32.440 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:32.440 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.699 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.699 "name": "pt2", 00:13:32.699 "aliases": [ 00:13:32.699 "00000000-0000-0000-0000-000000000002" 00:13:32.699 ], 00:13:32.699 "product_name": "passthru", 00:13:32.699 "block_size": 512, 00:13:32.699 "num_blocks": 65536, 00:13:32.699 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:32.699 "assigned_rate_limits": { 00:13:32.699 "rw_ios_per_sec": 0, 00:13:32.699 "rw_mbytes_per_sec": 0, 00:13:32.699 "r_mbytes_per_sec": 0, 00:13:32.699 "w_mbytes_per_sec": 0 
00:13:32.699 }, 00:13:32.699 "claimed": true, 00:13:32.699 "claim_type": "exclusive_write", 00:13:32.699 "zoned": false, 00:13:32.699 "supported_io_types": { 00:13:32.699 "read": true, 00:13:32.699 "write": true, 00:13:32.699 "unmap": true, 00:13:32.699 "flush": true, 00:13:32.699 "reset": true, 00:13:32.699 "nvme_admin": false, 00:13:32.699 "nvme_io": false, 00:13:32.699 "nvme_io_md": false, 00:13:32.699 "write_zeroes": true, 00:13:32.699 "zcopy": true, 00:13:32.699 "get_zone_info": false, 00:13:32.699 "zone_management": false, 00:13:32.699 "zone_append": false, 00:13:32.699 "compare": false, 00:13:32.699 "compare_and_write": false, 00:13:32.699 "abort": true, 00:13:32.699 "seek_hole": false, 00:13:32.699 "seek_data": false, 00:13:32.699 "copy": true, 00:13:32.699 "nvme_iov_md": false 00:13:32.699 }, 00:13:32.699 "memory_domains": [ 00:13:32.699 { 00:13:32.699 "dma_device_id": "system", 00:13:32.699 "dma_device_type": 1 00:13:32.699 }, 00:13:32.699 { 00:13:32.699 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.699 "dma_device_type": 2 00:13:32.699 } 00:13:32.699 ], 00:13:32.699 "driver_specific": { 00:13:32.699 "passthru": { 00:13:32.699 "name": "pt2", 00:13:32.699 "base_bdev_name": "malloc2" 00:13:32.699 } 00:13:32.699 } 00:13:32.699 }' 00:13:32.699 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.699 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.699 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:32.699 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:32.958 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:13:33.217 [2024-07-12 15:49:53.537756] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:33.217 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=a34ba41c-94e5-47b3-8786-030c5c9348f7 00:13:33.217 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z a34ba41c-94e5-47b3-8786-030c5c9348f7 ']' 00:13:33.217 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:33.476 [2024-07-12 15:49:53.734052] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:33.476 [2024-07-12 15:49:53.734063] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid 
bdev state changing from online to offline 00:13:33.476 [2024-07-12 15:49:53.734101] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:33.477 [2024-07-12 15:49:53.734138] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:33.477 [2024-07-12 15:49:53.734144] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d350f0 name raid_bdev1, state offline 00:13:33.477 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.477 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:13:33.738 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:13:33.738 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:13:33.738 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:33.739 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:33.739 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:13:33.739 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:34.052 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:34.052 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:13:34.328 [2024-07-12 15:49:54.688445] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:34.328 [2024-07-12 15:49:54.689498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:34.328 [2024-07-12 15:49:54.689539] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:34.328 [2024-07-12 15:49:54.689565] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:34.328 [2024-07-12 15:49:54.689575] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:34.328 [2024-07-12 15:49:54.689580] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d33f30 name raid_bdev1, state configuring 00:13:34.328 request: 00:13:34.328 { 00:13:34.328 "name": "raid_bdev1", 00:13:34.328 "raid_level": "raid1", 00:13:34.328 "base_bdevs": [ 00:13:34.328 "malloc1", 00:13:34.328 "malloc2" 00:13:34.328 ], 00:13:34.328 "superblock": false, 00:13:34.328 "method": "bdev_raid_create", 00:13:34.328 "req_id": 1 00:13:34.328 } 00:13:34.328 Got JSON-RPC error response 00:13:34.328 response: 00:13:34.328 { 00:13:34.328 "code": -17, 00:13:34.328 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:34.328 } 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.328 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:34.587 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:34.587 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:34.587 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:34.846 [2024-07-12 15:49:55.073383] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:34.846 [2024-07-12 15:49:55.073405] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:34.846 [2024-07-12 15:49:55.073415] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d34920 00:13:34.846 [2024-07-12 15:49:55.073421] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:34.846 [2024-07-12 15:49:55.074698] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:13:34.846 [2024-07-12 15:49:55.074723] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:34.846 [2024-07-12 15:49:55.074765] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:34.846 [2024-07-12 15:49:55.074782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:34.846 pt1 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.846 "name": "raid_bdev1", 00:13:34.846 "uuid": "a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:34.846 "strip_size_kb": 0, 00:13:34.846 "state": "configuring", 00:13:34.846 "raid_level": "raid1", 00:13:34.846 "superblock": true, 00:13:34.846 "num_base_bdevs": 2, 00:13:34.846 "num_base_bdevs_discovered": 1, 00:13:34.846 "num_base_bdevs_operational": 2, 00:13:34.846 "base_bdevs_list": [ 00:13:34.846 { 00:13:34.846 "name": "pt1", 00:13:34.846 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:34.846 "is_configured": true, 00:13:34.846 "data_offset": 2048, 00:13:34.846 "data_size": 63488 00:13:34.846 }, 00:13:34.846 { 00:13:34.846 "name": null, 00:13:34.846 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:34.846 "is_configured": false, 00:13:34.846 "data_offset": 2048, 00:13:34.846 "data_size": 63488 00:13:34.846 } 00:13:34.846 ] 00:13:34.846 }' 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.846 15:49:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.415 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:13:35.415 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:35.415 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:35.415 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:13:35.674 [2024-07-12 15:49:56.011768] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:35.674 [2024-07-12 15:49:56.011802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:35.674 [2024-07-12 15:49:56.011812] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b8ed80 00:13:35.674 [2024-07-12 15:49:56.011818] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:35.674 [2024-07-12 15:49:56.012075] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:35.674 [2024-07-12 15:49:56.012085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:35.674 [2024-07-12 15:49:56.012125] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:35.674 [2024-07-12 15:49:56.012138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:35.674 [2024-07-12 15:49:56.012213] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d36f20 00:13:35.674 [2024-07-12 15:49:56.012220] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:35.674 [2024-07-12 15:49:56.012352] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b86190 00:13:35.674 [2024-07-12 15:49:56.012451] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d36f20 00:13:35.674 [2024-07-12 15:49:56.012456] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d36f20 00:13:35.674 [2024-07-12 15:49:56.012525] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:35.674 pt2 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.674 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:35.934 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.934 "name": "raid_bdev1", 00:13:35.934 
"uuid": "a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:35.934 "strip_size_kb": 0, 00:13:35.934 "state": "online", 00:13:35.934 "raid_level": "raid1", 00:13:35.934 "superblock": true, 00:13:35.934 "num_base_bdevs": 2, 00:13:35.934 "num_base_bdevs_discovered": 2, 00:13:35.934 "num_base_bdevs_operational": 2, 00:13:35.934 "base_bdevs_list": [ 00:13:35.934 { 00:13:35.934 "name": "pt1", 00:13:35.934 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:35.934 "is_configured": true, 00:13:35.934 "data_offset": 2048, 00:13:35.934 "data_size": 63488 00:13:35.934 }, 00:13:35.934 { 00:13:35.934 "name": "pt2", 00:13:35.934 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:35.934 "is_configured": true, 00:13:35.934 "data_offset": 2048, 00:13:35.934 "data_size": 63488 00:13:35.934 } 00:13:35.934 ] 00:13:35.934 }' 00:13:35.934 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.934 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:36.503 [2024-07-12 15:49:56.922260] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:36.503 "name": "raid_bdev1", 00:13:36.503 "aliases": [ 00:13:36.503 "a34ba41c-94e5-47b3-8786-030c5c9348f7" 00:13:36.503 ], 00:13:36.503 "product_name": "Raid Volume", 00:13:36.503 "block_size": 512, 00:13:36.503 "num_blocks": 63488, 00:13:36.503 "uuid": "a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:36.503 "assigned_rate_limits": { 00:13:36.503 "rw_ios_per_sec": 0, 00:13:36.503 "rw_mbytes_per_sec": 0, 00:13:36.503 "r_mbytes_per_sec": 0, 00:13:36.503 "w_mbytes_per_sec": 0 00:13:36.503 }, 00:13:36.503 "claimed": false, 00:13:36.503 "zoned": false, 00:13:36.503 "supported_io_types": { 00:13:36.503 "read": true, 00:13:36.503 "write": true, 00:13:36.503 "unmap": false, 00:13:36.503 "flush": false, 00:13:36.503 "reset": true, 00:13:36.503 "nvme_admin": false, 00:13:36.503 "nvme_io": false, 00:13:36.503 "nvme_io_md": false, 00:13:36.503 "write_zeroes": true, 00:13:36.503 "zcopy": false, 00:13:36.503 "get_zone_info": false, 00:13:36.503 "zone_management": false, 00:13:36.503 "zone_append": false, 00:13:36.503 "compare": false, 00:13:36.503 "compare_and_write": false, 00:13:36.503 "abort": false, 00:13:36.503 "seek_hole": false, 00:13:36.503 "seek_data": false, 00:13:36.503 "copy": false, 00:13:36.503 "nvme_iov_md": false 00:13:36.503 }, 00:13:36.503 "memory_domains": [ 00:13:36.503 { 00:13:36.503 "dma_device_id": "system", 00:13:36.503 "dma_device_type": 1 
00:13:36.503 }, 00:13:36.503 { 00:13:36.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.503 "dma_device_type": 2 00:13:36.503 }, 00:13:36.503 { 00:13:36.503 "dma_device_id": "system", 00:13:36.503 "dma_device_type": 1 00:13:36.503 }, 00:13:36.503 { 00:13:36.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.503 "dma_device_type": 2 00:13:36.503 } 00:13:36.503 ], 00:13:36.503 "driver_specific": { 00:13:36.503 "raid": { 00:13:36.503 "uuid": "a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:36.503 "strip_size_kb": 0, 00:13:36.503 "state": "online", 00:13:36.503 "raid_level": "raid1", 00:13:36.503 "superblock": true, 00:13:36.503 "num_base_bdevs": 2, 00:13:36.503 "num_base_bdevs_discovered": 2, 00:13:36.503 "num_base_bdevs_operational": 2, 00:13:36.503 "base_bdevs_list": [ 00:13:36.503 { 00:13:36.503 "name": "pt1", 00:13:36.503 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:36.503 "is_configured": true, 00:13:36.503 "data_offset": 2048, 00:13:36.503 "data_size": 63488 00:13:36.503 }, 00:13:36.503 { 00:13:36.503 "name": "pt2", 00:13:36.503 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:36.503 "is_configured": true, 00:13:36.503 "data_offset": 2048, 00:13:36.503 "data_size": 63488 00:13:36.503 } 00:13:36.503 ] 00:13:36.503 } 00:13:36.503 } 00:13:36.503 }' 00:13:36.503 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:36.763 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:36.763 pt2' 00:13:36.763 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:36.763 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:36.763 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:36.763 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:36.763 "name": "pt1", 00:13:36.763 "aliases": [ 00:13:36.763 "00000000-0000-0000-0000-000000000001" 00:13:36.763 ], 00:13:36.763 "product_name": "passthru", 00:13:36.763 "block_size": 512, 00:13:36.763 "num_blocks": 65536, 00:13:36.763 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:36.763 "assigned_rate_limits": { 00:13:36.763 "rw_ios_per_sec": 0, 00:13:36.763 "rw_mbytes_per_sec": 0, 00:13:36.763 "r_mbytes_per_sec": 0, 00:13:36.763 "w_mbytes_per_sec": 0 00:13:36.763 }, 00:13:36.763 "claimed": true, 00:13:36.763 "claim_type": "exclusive_write", 00:13:36.763 "zoned": false, 00:13:36.763 "supported_io_types": { 00:13:36.763 "read": true, 00:13:36.763 "write": true, 00:13:36.763 "unmap": true, 00:13:36.763 "flush": true, 00:13:36.763 "reset": true, 00:13:36.763 "nvme_admin": false, 00:13:36.763 "nvme_io": false, 00:13:36.763 "nvme_io_md": false, 00:13:36.763 "write_zeroes": true, 00:13:36.763 "zcopy": true, 00:13:36.763 "get_zone_info": false, 00:13:36.763 "zone_management": false, 00:13:36.763 "zone_append": false, 00:13:36.763 "compare": false, 00:13:36.763 "compare_and_write": false, 00:13:36.763 "abort": true, 00:13:36.763 "seek_hole": false, 00:13:36.763 "seek_data": false, 00:13:36.763 "copy": true, 00:13:36.763 "nvme_iov_md": false 00:13:36.763 }, 00:13:36.763 "memory_domains": [ 00:13:36.763 { 00:13:36.763 "dma_device_id": "system", 00:13:36.763 "dma_device_type": 1 00:13:36.763 }, 00:13:36.763 { 00:13:36.763 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:13:36.763 "dma_device_type": 2 00:13:36.763 } 00:13:36.763 ], 00:13:36.763 "driver_specific": { 00:13:36.763 "passthru": { 00:13:36.763 "name": "pt1", 00:13:36.763 "base_bdev_name": "malloc1" 00:13:36.763 } 00:13:36.763 } 00:13:36.763 }' 00:13:36.763 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.022 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.022 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:37.022 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.022 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.023 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:37.023 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.023 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.023 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:37.023 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.283 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.283 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:37.283 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:37.283 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:37.283 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:37.283 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:37.283 "name": "pt2", 00:13:37.283 "aliases": [ 00:13:37.283 "00000000-0000-0000-0000-000000000002" 00:13:37.283 ], 00:13:37.283 "product_name": "passthru", 00:13:37.283 "block_size": 512, 00:13:37.283 "num_blocks": 65536, 00:13:37.283 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:37.283 "assigned_rate_limits": { 00:13:37.283 "rw_ios_per_sec": 0, 00:13:37.283 "rw_mbytes_per_sec": 0, 00:13:37.283 "r_mbytes_per_sec": 0, 00:13:37.283 "w_mbytes_per_sec": 0 00:13:37.283 }, 00:13:37.283 "claimed": true, 00:13:37.283 "claim_type": "exclusive_write", 00:13:37.283 "zoned": false, 00:13:37.283 "supported_io_types": { 00:13:37.283 "read": true, 00:13:37.283 "write": true, 00:13:37.283 "unmap": true, 00:13:37.283 "flush": true, 00:13:37.283 "reset": true, 00:13:37.283 "nvme_admin": false, 00:13:37.283 "nvme_io": false, 00:13:37.283 "nvme_io_md": false, 00:13:37.283 "write_zeroes": true, 00:13:37.283 "zcopy": true, 00:13:37.283 "get_zone_info": false, 00:13:37.283 "zone_management": false, 00:13:37.283 "zone_append": false, 00:13:37.283 "compare": false, 00:13:37.283 "compare_and_write": false, 00:13:37.283 "abort": true, 00:13:37.283 "seek_hole": false, 00:13:37.283 "seek_data": false, 00:13:37.283 "copy": true, 00:13:37.283 "nvme_iov_md": false 00:13:37.283 }, 00:13:37.283 "memory_domains": [ 00:13:37.283 { 00:13:37.283 "dma_device_id": "system", 00:13:37.283 "dma_device_type": 1 00:13:37.283 }, 00:13:37.283 { 00:13:37.283 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:37.283 "dma_device_type": 2 00:13:37.283 } 00:13:37.283 ], 00:13:37.283 "driver_specific": { 
00:13:37.283 "passthru": { 00:13:37.283 "name": "pt2", 00:13:37.283 "base_bdev_name": "malloc2" 00:13:37.283 } 00:13:37.283 } 00:13:37.283 }' 00:13:37.283 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.543 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:37.543 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:37.543 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.543 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:37.543 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:37.543 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.543 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:37.543 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:37.543 15:49:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.802 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:37.802 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:37.802 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:37.802 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:37.802 [2024-07-12 15:49:58.209550] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:37.802 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' a34ba41c-94e5-47b3-8786-030c5c9348f7 '!=' a34ba41c-94e5-47b3-8786-030c5c9348f7 ']' 00:13:37.802 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:13:37.802 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:37.802 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:37.802 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:38.062 [2024-07-12 15:49:58.401859] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.062 15:49:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.062 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:38.321 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.321 "name": "raid_bdev1", 00:13:38.321 "uuid": "a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:38.321 "strip_size_kb": 0, 00:13:38.321 "state": "online", 00:13:38.321 "raid_level": "raid1", 00:13:38.321 "superblock": true, 00:13:38.321 "num_base_bdevs": 2, 00:13:38.321 "num_base_bdevs_discovered": 1, 00:13:38.321 "num_base_bdevs_operational": 1, 00:13:38.321 "base_bdevs_list": [ 00:13:38.321 { 00:13:38.321 "name": null, 00:13:38.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.321 "is_configured": false, 00:13:38.321 "data_offset": 2048, 00:13:38.321 "data_size": 63488 00:13:38.321 }, 00:13:38.321 { 00:13:38.321 "name": "pt2", 00:13:38.321 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:38.321 "is_configured": true, 00:13:38.321 "data_offset": 2048, 00:13:38.321 "data_size": 63488 00:13:38.321 } 00:13:38.321 ] 00:13:38.321 }' 00:13:38.321 15:49:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.321 15:49:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.890 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:38.890 [2024-07-12 15:49:59.312140] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:38.890 [2024-07-12 15:49:59.312156] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:38.890 [2024-07-12 15:49:59.312193] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:38.890 [2024-07-12 15:49:59.312229] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:38.890 [2024-07-12 15:49:59.312236] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d36f20 name raid_bdev1, state offline 00:13:38.890 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.890 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:13:39.149 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:13:39.149 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:13:39.149 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:13:39.149 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:39.149 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:39.408 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:13:39.408 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:13:39.408 15:49:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:13:39.408 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:13:39.408 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:13:39.408 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:39.668 [2024-07-12 15:49:59.893593] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:39.668 [2024-07-12 15:49:59.893628] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:39.668 [2024-07-12 15:49:59.893640] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b862a0 00:13:39.668 [2024-07-12 15:49:59.893647] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:39.668 [2024-07-12 15:49:59.894943] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:39.668 [2024-07-12 15:49:59.894962] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:39.668 [2024-07-12 15:49:59.895007] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:39.668 [2024-07-12 15:49:59.895025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:39.668 [2024-07-12 15:49:59.895086] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b85b20 00:13:39.668 [2024-07-12 15:49:59.895092] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:39.668 [2024-07-12 15:49:59.895225] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b86530 00:13:39.668 [2024-07-12 15:49:59.895319] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b85b20 00:13:39.668 [2024-07-12 15:49:59.895324] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b85b20 00:13:39.668 [2024-07-12 15:49:59.895394] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:39.668 pt2 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:39.668 15:49:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.940 15:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:39.940 "name": "raid_bdev1", 00:13:39.940 "uuid": "a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:39.940 "strip_size_kb": 0, 00:13:39.940 "state": "online", 00:13:39.940 "raid_level": "raid1", 00:13:39.940 "superblock": true, 00:13:39.940 "num_base_bdevs": 2, 00:13:39.940 "num_base_bdevs_discovered": 1, 00:13:39.940 "num_base_bdevs_operational": 1, 00:13:39.940 "base_bdevs_list": [ 00:13:39.940 { 00:13:39.940 "name": null, 00:13:39.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:39.940 "is_configured": false, 00:13:39.940 "data_offset": 2048, 00:13:39.940 "data_size": 63488 00:13:39.940 }, 00:13:39.940 { 00:13:39.940 "name": "pt2", 00:13:39.940 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:39.940 "is_configured": true, 00:13:39.940 "data_offset": 2048, 00:13:39.940 "data_size": 63488 00:13:39.940 } 00:13:39.940 ] 00:13:39.940 }' 00:13:39.940 15:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:39.940 15:50:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.199 15:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:40.459 [2024-07-12 15:50:00.824099] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:40.459 [2024-07-12 15:50:00.824114] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:40.459 [2024-07-12 15:50:00.824151] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:40.459 [2024-07-12 15:50:00.824182] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:40.459 [2024-07-12 15:50:00.824188] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b85b20 name raid_bdev1, state offline 00:13:40.459 15:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.459 15:50:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:13:40.718 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:13:40.719 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:13:40.719 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:13:40.719 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:40.979 [2024-07-12 15:50:01.209059] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:40.979 [2024-07-12 15:50:01.209086] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:40.979 [2024-07-12 15:50:01.209095] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b85da0 00:13:40.979 [2024-07-12 15:50:01.209102] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:40.979 [2024-07-12 15:50:01.210375] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:40.979 [2024-07-12 15:50:01.210394] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:40.979 [2024-07-12 15:50:01.210439] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:40.979 [2024-07-12 15:50:01.210456] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:40.979 [2024-07-12 15:50:01.210528] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:40.979 [2024-07-12 15:50:01.210536] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:40.979 [2024-07-12 15:50:01.210548] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d37a70 name raid_bdev1, state configuring 00:13:40.979 [2024-07-12 15:50:01.210561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:40.979 [2024-07-12 15:50:01.210600] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d36670 00:13:40.979 [2024-07-12 15:50:01.210605] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:40.979 [2024-07-12 15:50:01.210746] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b8e820 00:13:40.979 [2024-07-12 15:50:01.210842] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d36670 00:13:40.979 [2024-07-12 15:50:01.210847] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d36670 00:13:40.979 [2024-07-12 15:50:01.210917] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:40.979 pt1 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.979 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:41.238 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.238 "name": "raid_bdev1", 00:13:41.238 "uuid": "a34ba41c-94e5-47b3-8786-030c5c9348f7", 00:13:41.238 "strip_size_kb": 0, 00:13:41.238 "state": "online", 
00:13:41.238 "raid_level": "raid1", 00:13:41.238 "superblock": true, 00:13:41.238 "num_base_bdevs": 2, 00:13:41.238 "num_base_bdevs_discovered": 1, 00:13:41.238 "num_base_bdevs_operational": 1, 00:13:41.238 "base_bdevs_list": [ 00:13:41.238 { 00:13:41.238 "name": null, 00:13:41.238 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.238 "is_configured": false, 00:13:41.238 "data_offset": 2048, 00:13:41.238 "data_size": 63488 00:13:41.238 }, 00:13:41.238 { 00:13:41.238 "name": "pt2", 00:13:41.238 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:41.238 "is_configured": true, 00:13:41.238 "data_offset": 2048, 00:13:41.238 "data_size": 63488 00:13:41.238 } 00:13:41.238 ] 00:13:41.238 }' 00:13:41.238 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.238 15:50:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.808 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:41.808 15:50:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:41.808 15:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:13:41.808 15:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:41.808 15:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:13:42.068 [2024-07-12 15:50:02.364159] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' a34ba41c-94e5-47b3-8786-030c5c9348f7 '!=' a34ba41c-94e5-47b3-8786-030c5c9348f7 ']' 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2516507 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2516507 ']' 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2516507 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2516507 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2516507' 00:13:42.068 killing process with pid 2516507 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2516507 00:13:42.068 [2024-07-12 15:50:02.430088] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:42.068 [2024-07-12 15:50:02.430126] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:42.068 [2024-07-12 15:50:02.430155] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:42.068 [2024-07-12 15:50:02.430161] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1d36670 name raid_bdev1, state offline 00:13:42.068 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2516507 00:13:42.068 [2024-07-12 15:50:02.439458] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:42.328 15:50:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:42.328 00:13:42.328 real 0m13.028s 00:13:42.328 user 0m24.219s 00:13:42.328 sys 0m1.917s 00:13:42.328 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:42.328 15:50:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.328 ************************************ 00:13:42.328 END TEST raid_superblock_test 00:13:42.328 ************************************ 00:13:42.328 15:50:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:42.328 15:50:02 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:13:42.328 15:50:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:42.328 15:50:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:42.328 15:50:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:42.328 ************************************ 00:13:42.328 START TEST raid_read_error_test 00:13:42.328 ************************************ 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.YZmveTjWfg 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2518984 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2518984 /var/tmp/spdk-raid.sock 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2518984 ']' 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:42.328 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.328 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:42.328 [2024-07-12 15:50:02.698526] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:13:42.328 [2024-07-12 15:50:02.698579] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2518984 ] 00:13:42.588 [2024-07-12 15:50:02.789445] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:42.588 [2024-07-12 15:50:02.857033] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:42.588 [2024-07-12 15:50:02.906575] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:42.588 [2024-07-12 15:50:02.906605] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:43.527 15:50:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:43.527 15:50:03 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:43.527 15:50:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:43.527 15:50:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:44.097 BaseBdev1_malloc 00:13:44.097 15:50:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:44.666 true 00:13:44.666 15:50:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:45.236 [2024-07-12 15:50:05.477207] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:45.236 [2024-07-12 15:50:05.477240] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:45.236 [2024-07-12 15:50:05.477252] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10dcaa0 00:13:45.236 [2024-07-12 15:50:05.477258] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:45.236 [2024-07-12 15:50:05.478632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:45.236 [2024-07-12 15:50:05.478656] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:45.236 BaseBdev1 00:13:45.236 15:50:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:45.236 15:50:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:45.833 BaseBdev2_malloc 00:13:45.833 15:50:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:46.402 true 00:13:46.402 15:50:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:46.663 [2024-07-12 15:50:07.103130] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:46.663 [2024-07-12 15:50:07.103158] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:46.663 [2024-07-12 15:50:07.103170] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10e1e40 00:13:46.663 [2024-07-12 15:50:07.103176] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:46.663 [2024-07-12 15:50:07.104377] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:46.663 [2024-07-12 15:50:07.104396] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:46.663 BaseBdev2 00:13:46.923 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:47.493 [2024-07-12 15:50:07.644489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:47.493 [2024-07-12 15:50:07.645506] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:47.493 [2024-07-12 15:50:07.645650] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10e3000 00:13:47.493 [2024-07-12 15:50:07.645659] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:47.493 [2024-07-12 15:50:07.645813] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf39ef0 00:13:47.493 [2024-07-12 15:50:07.645930] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10e3000 00:13:47.493 [2024-07-12 15:50:07.645935] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10e3000 00:13:47.493 [2024-07-12 15:50:07.646011] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:47.493 15:50:07 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.493 15:50:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:48.062 15:50:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.062 "name": "raid_bdev1", 00:13:48.062 "uuid": "ed99b368-8c87-47bf-b255-89f17d6a2452", 00:13:48.062 "strip_size_kb": 0, 00:13:48.062 "state": "online", 00:13:48.062 "raid_level": "raid1", 00:13:48.062 "superblock": true, 00:13:48.062 "num_base_bdevs": 2, 00:13:48.062 "num_base_bdevs_discovered": 2, 00:13:48.062 "num_base_bdevs_operational": 2, 00:13:48.062 "base_bdevs_list": [ 00:13:48.062 { 00:13:48.062 "name": "BaseBdev1", 00:13:48.062 "uuid": "f5fbef3a-5534-528c-81ad-7581bbec9c38", 00:13:48.062 "is_configured": true, 00:13:48.062 "data_offset": 2048, 00:13:48.062 "data_size": 63488 00:13:48.062 }, 00:13:48.062 { 00:13:48.062 "name": "BaseBdev2", 00:13:48.062 "uuid": "f24266ff-c6c5-5fb8-a80b-f7f6d2772936", 00:13:48.062 "is_configured": true, 00:13:48.062 "data_offset": 2048, 00:13:48.062 "data_size": 63488 00:13:48.062 } 00:13:48.062 ] 00:13:48.062 }' 00:13:48.062 15:50:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.062 15:50:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:49.001 15:50:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:49.001 15:50:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:49.001 [2024-07-12 15:50:09.192646] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf379f0 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 
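By this point the read-error run has built its whole stack with plain rpc.py calls: two 32 MiB malloc bdevs, an error injector (EE_*) and a passthru bdev on top of each, and a raid1 bdev with a superblock across the two passthru legs, with a read failure armed on EE_BaseBdev1_malloc. A minimal sketch of the same sequence driven by hand, using only the commands and flags visible in the trace (the RPC shortcut variable and the loop are added here for brevity; the bdevperf instance started above must already be listening on /var/tmp/spdk-raid.sock):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
for i in 1 2; do
    $RPC bdev_malloc_create 32 512 -b BaseBdev${i}_malloc          # 32 MiB backing store, 512-byte blocks (65536 blocks)
    $RPC bdev_error_create BaseBdev${i}_malloc                     # exposes the injector bdev EE_BaseBdev${i}_malloc
    $RPC bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
done
$RPC bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s     # -s: create with superblock
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'   # expect "state": "online" with 2 base bdevs
$RPC bdev_error_inject_error EE_BaseBdev1_malloc read failure                # arm a read error on one leg

Because the level is raid1 and the injected I/O type is read, the test keeps expected_num_base_bdevs at 2: the failed read can be served from the mirror, so the erroring base bdev is not dropped from the array.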
00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.941 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:50.201 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.201 "name": "raid_bdev1", 00:13:50.201 "uuid": "ed99b368-8c87-47bf-b255-89f17d6a2452", 00:13:50.201 "strip_size_kb": 0, 00:13:50.201 "state": "online", 00:13:50.201 "raid_level": "raid1", 00:13:50.201 "superblock": true, 00:13:50.201 "num_base_bdevs": 2, 00:13:50.201 "num_base_bdevs_discovered": 2, 00:13:50.201 "num_base_bdevs_operational": 2, 00:13:50.201 "base_bdevs_list": [ 00:13:50.201 { 00:13:50.201 "name": "BaseBdev1", 00:13:50.201 "uuid": "f5fbef3a-5534-528c-81ad-7581bbec9c38", 00:13:50.201 "is_configured": true, 00:13:50.201 "data_offset": 2048, 00:13:50.201 "data_size": 63488 00:13:50.201 }, 00:13:50.201 { 00:13:50.201 "name": "BaseBdev2", 00:13:50.201 "uuid": "f24266ff-c6c5-5fb8-a80b-f7f6d2772936", 00:13:50.201 "is_configured": true, 00:13:50.201 "data_offset": 2048, 00:13:50.201 "data_size": 63488 00:13:50.201 } 00:13:50.201 ] 00:13:50.201 }' 00:13:50.201 15:50:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.201 15:50:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.771 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:50.771 [2024-07-12 15:50:11.216689] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:50.771 [2024-07-12 15:50:11.216726] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:51.030 [2024-07-12 15:50:11.219311] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:51.030 [2024-07-12 15:50:11.219333] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:51.030 [2024-07-12 15:50:11.219393] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:51.030 
[2024-07-12 15:50:11.219399] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10e3000 name raid_bdev1, state offline 00:13:51.030 0 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2518984 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2518984 ']' 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2518984 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2518984 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2518984' 00:13:51.030 killing process with pid 2518984 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2518984 00:13:51.030 [2024-07-12 15:50:11.282992] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2518984 00:13:51.030 [2024-07-12 15:50:11.288823] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.YZmveTjWfg 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:51.030 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:51.031 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:51.031 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:51.031 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:51.031 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:51.031 15:50:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:51.031 00:13:51.031 real 0m8.790s 00:13:51.031 user 0m14.946s 00:13:51.031 sys 0m1.064s 00:13:51.031 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:51.031 15:50:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.031 ************************************ 00:13:51.031 END TEST raid_read_error_test 00:13:51.031 ************************************ 00:13:51.031 15:50:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:51.031 15:50:11 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:13:51.031 15:50:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:51.031 15:50:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:51.031 15:50:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:51.291 ************************************ 00:13:51.291 START TEST raid_write_error_test 00:13:51.291 ************************************ 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.YHJhfGOeRF 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2520606 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2520606 /var/tmp/spdk-raid.sock 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2520606 ']' 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:51.291 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
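The pass criterion for the read run above is taken straight from the bdevperf log file: after perform_tests finishes and raid_bdev1 is deleted, the script greps the per-target result line out of /raidtest/tmp.YZmveTjWfg and reads the sixth field as the failure rate. A sketch of that check with the same grep/awk pipeline as in the trace (fail_per_s matches the script's local variable name):

fail_per_s=$(grep -v Job /raidtest/tmp.YZmveTjWfg | grep raid_bdev1 | awk '{print $6}')
[[ $fail_per_s = 0.00 ]]   # raid1 has redundancy, so the injected read error must not surface as a failed I/O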
00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:51.291 15:50:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.291 [2024-07-12 15:50:11.560634] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:13:51.291 [2024-07-12 15:50:11.560678] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2520606 ] 00:13:51.291 [2024-07-12 15:50:11.647384] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:51.291 [2024-07-12 15:50:11.710138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:51.551 [2024-07-12 15:50:11.757090] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.551 [2024-07-12 15:50:11.757117] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:52.120 15:50:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:52.120 15:50:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:52.120 15:50:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:52.120 15:50:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:52.381 BaseBdev1_malloc 00:13:52.381 15:50:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:52.381 true 00:13:52.381 15:50:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:52.642 [2024-07-12 15:50:12.939828] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:52.642 [2024-07-12 15:50:12.939857] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:52.642 [2024-07-12 15:50:12.939868] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de2aa0 00:13:52.642 [2024-07-12 15:50:12.939874] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:52.642 [2024-07-12 15:50:12.941107] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:52.642 [2024-07-12 15:50:12.941126] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:52.642 BaseBdev1 00:13:52.642 15:50:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:52.642 15:50:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:52.901 BaseBdev2_malloc 00:13:52.901 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:52.901 true 00:13:52.901 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:53.160 [2024-07-12 15:50:13.503208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:53.160 [2024-07-12 15:50:13.503236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:53.160 [2024-07-12 15:50:13.503247] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1de7e40 00:13:53.160 [2024-07-12 15:50:13.503253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:53.160 [2024-07-12 15:50:13.504426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:53.160 [2024-07-12 15:50:13.504443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:53.160 BaseBdev2 00:13:53.160 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:53.452 [2024-07-12 15:50:13.695727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:53.452 [2024-07-12 15:50:13.696735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:53.452 [2024-07-12 15:50:13.696877] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1de9000 00:13:53.452 [2024-07-12 15:50:13.696885] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:53.452 [2024-07-12 15:50:13.697031] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3fef0 00:13:53.452 [2024-07-12 15:50:13.697150] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1de9000 00:13:53.452 [2024-07-12 15:50:13.697155] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1de9000 00:13:53.452 [2024-07-12 15:50:13.697230] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.452 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r 
'.[] | select(.name == "raid_bdev1")' 00:13:53.750 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.750 "name": "raid_bdev1", 00:13:53.750 "uuid": "3c025b10-adce-4113-8f4c-94e69ddfc220", 00:13:53.750 "strip_size_kb": 0, 00:13:53.750 "state": "online", 00:13:53.750 "raid_level": "raid1", 00:13:53.750 "superblock": true, 00:13:53.750 "num_base_bdevs": 2, 00:13:53.750 "num_base_bdevs_discovered": 2, 00:13:53.750 "num_base_bdevs_operational": 2, 00:13:53.750 "base_bdevs_list": [ 00:13:53.750 { 00:13:53.750 "name": "BaseBdev1", 00:13:53.750 "uuid": "ab127eb8-ad41-5f76-996d-ef30ee33c465", 00:13:53.750 "is_configured": true, 00:13:53.750 "data_offset": 2048, 00:13:53.750 "data_size": 63488 00:13:53.750 }, 00:13:53.750 { 00:13:53.750 "name": "BaseBdev2", 00:13:53.750 "uuid": "dcb5cf47-5b75-5958-8059-3bf586117f55", 00:13:53.750 "is_configured": true, 00:13:53.750 "data_offset": 2048, 00:13:53.750 "data_size": 63488 00:13:53.750 } 00:13:53.750 ] 00:13:53.750 }' 00:13:53.750 15:50:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.750 15:50:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.016 15:50:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:54.016 15:50:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:54.276 [2024-07-12 15:50:14.550160] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3d9f0 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:55.214 [2024-07-12 15:50:15.639034] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:13:55.214 [2024-07-12 15:50:15.639074] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:55.214 [2024-07-12 15:50:15.639234] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1c3d9f0 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.214 15:50:15 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.214 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.474 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.474 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:55.474 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.474 "name": "raid_bdev1", 00:13:55.474 "uuid": "3c025b10-adce-4113-8f4c-94e69ddfc220", 00:13:55.474 "strip_size_kb": 0, 00:13:55.474 "state": "online", 00:13:55.474 "raid_level": "raid1", 00:13:55.474 "superblock": true, 00:13:55.474 "num_base_bdevs": 2, 00:13:55.474 "num_base_bdevs_discovered": 1, 00:13:55.474 "num_base_bdevs_operational": 1, 00:13:55.474 "base_bdevs_list": [ 00:13:55.474 { 00:13:55.474 "name": null, 00:13:55.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.474 "is_configured": false, 00:13:55.474 "data_offset": 2048, 00:13:55.474 "data_size": 63488 00:13:55.474 }, 00:13:55.474 { 00:13:55.474 "name": "BaseBdev2", 00:13:55.474 "uuid": "dcb5cf47-5b75-5958-8059-3bf586117f55", 00:13:55.474 "is_configured": true, 00:13:55.474 "data_offset": 2048, 00:13:55.474 "data_size": 63488 00:13:55.474 } 00:13:55.474 ] 00:13:55.474 }' 00:13:55.474 15:50:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.474 15:50:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.043 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:56.303 [2024-07-12 15:50:16.552618] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:56.303 [2024-07-12 15:50:16.552643] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:56.303 [2024-07-12 15:50:16.555197] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:56.303 [2024-07-12 15:50:16.555216] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:56.303 [2024-07-12 15:50:16.555256] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:56.303 [2024-07-12 15:50:16.555262] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1de9000 name raid_bdev1, state offline 00:13:56.303 0 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2520606 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2520606 ']' 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2520606 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2520606 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2520606' 00:13:56.303 killing process with pid 2520606 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2520606 00:13:56.303 [2024-07-12 15:50:16.638389] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:56.303 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2520606 00:13:56.303 [2024-07-12 15:50:16.643664] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.YHJhfGOeRF 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:13:56.563 00:13:56.563 real 0m5.280s 00:13:56.563 user 0m8.289s 00:13:56.563 sys 0m0.729s 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:56.563 15:50:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.563 ************************************ 00:13:56.563 END TEST raid_write_error_test 00:13:56.563 ************************************ 00:13:56.563 15:50:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:56.563 15:50:16 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:13:56.563 15:50:16 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:56.563 15:50:16 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:13:56.563 15:50:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:56.563 15:50:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:56.563 15:50:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:56.563 ************************************ 00:13:56.563 START TEST raid_state_function_test 00:13:56.563 ************************************ 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= 
num_base_bdevs )) 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:56.563 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2521621 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2521621' 00:13:56.564 Process raid pid: 2521621 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2521621 /var/tmp/spdk-raid.sock 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2521621 ']' 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:56.564 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
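Before the state-function run below gets going, note how the write run above diverged from the read run: the injected I/O type was write, so _raid_bdev_fail_base_bdev failed BaseBdev1 out of the array and the script lowered expected_num_base_bdevs to 1 while raid_bdev1 stayed online in degraded form. A minimal sketch of that check, reusing only commands shown in the trace (RPC defined as in the earlier sketch):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_error_inject_error EE_BaseBdev1_malloc write failure
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
# expect "num_base_bdevs_discovered": 1 and "num_base_bdevs_operational": 1, with "state" still "online"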
00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:56.564 15:50:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.564 [2024-07-12 15:50:16.924225] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:13:56.564 [2024-07-12 15:50:16.924283] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:56.823 [2024-07-12 15:50:17.012524] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:56.823 [2024-07-12 15:50:17.075426] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:56.823 [2024-07-12 15:50:17.113353] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:56.823 [2024-07-12 15:50:17.113374] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:57.393 15:50:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:57.393 15:50:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:57.393 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:57.653 [2024-07-12 15:50:17.924234] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:57.653 [2024-07-12 15:50:17.924262] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:57.653 [2024-07-12 15:50:17.924268] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:57.653 [2024-07-12 15:50:17.924274] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:57.653 [2024-07-12 15:50:17.924279] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:57.653 [2024-07-12 15:50:17.924284] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.653 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.913 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.913 "name": "Existed_Raid", 00:13:57.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.913 "strip_size_kb": 64, 00:13:57.913 "state": "configuring", 00:13:57.913 "raid_level": "raid0", 00:13:57.913 "superblock": false, 00:13:57.913 "num_base_bdevs": 3, 00:13:57.913 "num_base_bdevs_discovered": 0, 00:13:57.913 "num_base_bdevs_operational": 3, 00:13:57.913 "base_bdevs_list": [ 00:13:57.913 { 00:13:57.913 "name": "BaseBdev1", 00:13:57.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.913 "is_configured": false, 00:13:57.913 "data_offset": 0, 00:13:57.913 "data_size": 0 00:13:57.913 }, 00:13:57.913 { 00:13:57.913 "name": "BaseBdev2", 00:13:57.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.913 "is_configured": false, 00:13:57.913 "data_offset": 0, 00:13:57.913 "data_size": 0 00:13:57.913 }, 00:13:57.913 { 00:13:57.913 "name": "BaseBdev3", 00:13:57.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.913 "is_configured": false, 00:13:57.913 "data_offset": 0, 00:13:57.913 "data_size": 0 00:13:57.913 } 00:13:57.913 ] 00:13:57.913 }' 00:13:57.913 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.913 15:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:58.481 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:58.481 [2024-07-12 15:50:18.866500] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:58.481 [2024-07-12 15:50:18.866517] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ea900 name Existed_Raid, state configuring 00:13:58.481 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:58.741 [2024-07-12 15:50:19.063017] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:58.741 [2024-07-12 15:50:19.063036] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:58.741 [2024-07-12 15:50:19.063041] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:58.741 [2024-07-12 15:50:19.063046] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:58.741 [2024-07-12 15:50:19.063051] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:58.741 [2024-07-12 15:50:19.063056] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:58.741 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:59.002 [2024-07-12 15:50:19.266064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:59.002 BaseBdev1 00:13:59.002 
15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:59.002 15:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:59.002 15:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:59.002 15:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:59.002 15:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:59.002 15:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:59.002 15:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:59.262 [ 00:13:59.262 { 00:13:59.262 "name": "BaseBdev1", 00:13:59.262 "aliases": [ 00:13:59.262 "7b772e60-f661-4199-8f93-7429bfcd87f1" 00:13:59.262 ], 00:13:59.262 "product_name": "Malloc disk", 00:13:59.262 "block_size": 512, 00:13:59.262 "num_blocks": 65536, 00:13:59.262 "uuid": "7b772e60-f661-4199-8f93-7429bfcd87f1", 00:13:59.262 "assigned_rate_limits": { 00:13:59.262 "rw_ios_per_sec": 0, 00:13:59.262 "rw_mbytes_per_sec": 0, 00:13:59.262 "r_mbytes_per_sec": 0, 00:13:59.262 "w_mbytes_per_sec": 0 00:13:59.262 }, 00:13:59.262 "claimed": true, 00:13:59.262 "claim_type": "exclusive_write", 00:13:59.262 "zoned": false, 00:13:59.262 "supported_io_types": { 00:13:59.262 "read": true, 00:13:59.262 "write": true, 00:13:59.262 "unmap": true, 00:13:59.262 "flush": true, 00:13:59.262 "reset": true, 00:13:59.262 "nvme_admin": false, 00:13:59.262 "nvme_io": false, 00:13:59.262 "nvme_io_md": false, 00:13:59.262 "write_zeroes": true, 00:13:59.262 "zcopy": true, 00:13:59.262 "get_zone_info": false, 00:13:59.262 "zone_management": false, 00:13:59.262 "zone_append": false, 00:13:59.262 "compare": false, 00:13:59.262 "compare_and_write": false, 00:13:59.262 "abort": true, 00:13:59.262 "seek_hole": false, 00:13:59.262 "seek_data": false, 00:13:59.262 "copy": true, 00:13:59.262 "nvme_iov_md": false 00:13:59.262 }, 00:13:59.262 "memory_domains": [ 00:13:59.262 { 00:13:59.262 "dma_device_id": "system", 00:13:59.262 "dma_device_type": 1 00:13:59.262 }, 00:13:59.262 { 00:13:59.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.262 "dma_device_type": 2 00:13:59.262 } 00:13:59.262 ], 00:13:59.262 "driver_specific": {} 00:13:59.262 } 00:13:59.262 ] 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.262 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.522 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.522 "name": "Existed_Raid", 00:13:59.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.522 "strip_size_kb": 64, 00:13:59.522 "state": "configuring", 00:13:59.522 "raid_level": "raid0", 00:13:59.522 "superblock": false, 00:13:59.522 "num_base_bdevs": 3, 00:13:59.522 "num_base_bdevs_discovered": 1, 00:13:59.522 "num_base_bdevs_operational": 3, 00:13:59.522 "base_bdevs_list": [ 00:13:59.522 { 00:13:59.522 "name": "BaseBdev1", 00:13:59.522 "uuid": "7b772e60-f661-4199-8f93-7429bfcd87f1", 00:13:59.522 "is_configured": true, 00:13:59.522 "data_offset": 0, 00:13:59.522 "data_size": 65536 00:13:59.522 }, 00:13:59.522 { 00:13:59.522 "name": "BaseBdev2", 00:13:59.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.522 "is_configured": false, 00:13:59.522 "data_offset": 0, 00:13:59.522 "data_size": 0 00:13:59.522 }, 00:13:59.522 { 00:13:59.522 "name": "BaseBdev3", 00:13:59.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.522 "is_configured": false, 00:13:59.522 "data_offset": 0, 00:13:59.522 "data_size": 0 00:13:59.522 } 00:13:59.522 ] 00:13:59.522 }' 00:13:59.522 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.522 15:50:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:00.091 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:00.352 [2024-07-12 15:50:20.573355] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:00.352 [2024-07-12 15:50:20.573385] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ea190 name Existed_Raid, state configuring 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:00.352 [2024-07-12 15:50:20.757848] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:00.352 [2024-07-12 15:50:20.758937] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:00.352 [2024-07-12 15:50:20.758959] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:00.352 [2024-07-12 15:50:20.758965] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:00.352 [2024-07-12 15:50:20.758971] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev3 doesn't exist now 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.352 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.612 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.612 "name": "Existed_Raid", 00:14:00.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.612 "strip_size_kb": 64, 00:14:00.612 "state": "configuring", 00:14:00.612 "raid_level": "raid0", 00:14:00.612 "superblock": false, 00:14:00.612 "num_base_bdevs": 3, 00:14:00.612 "num_base_bdevs_discovered": 1, 00:14:00.612 "num_base_bdevs_operational": 3, 00:14:00.612 "base_bdevs_list": [ 00:14:00.612 { 00:14:00.612 "name": "BaseBdev1", 00:14:00.612 "uuid": "7b772e60-f661-4199-8f93-7429bfcd87f1", 00:14:00.612 "is_configured": true, 00:14:00.612 "data_offset": 0, 00:14:00.612 "data_size": 65536 00:14:00.612 }, 00:14:00.612 { 00:14:00.612 "name": "BaseBdev2", 00:14:00.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.612 "is_configured": false, 00:14:00.612 "data_offset": 0, 00:14:00.612 "data_size": 0 00:14:00.612 }, 00:14:00.612 { 00:14:00.612 "name": "BaseBdev3", 00:14:00.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:00.612 "is_configured": false, 00:14:00.612 "data_offset": 0, 00:14:00.612 "data_size": 0 00:14:00.612 } 00:14:00.612 ] 00:14:00.612 }' 00:14:00.612 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.612 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:01.182 15:50:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:01.442 [2024-07-12 15:50:21.688989] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:01.442 BaseBdev2 00:14:01.442 15:50:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:01.442 15:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:01.442 15:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:01.442 15:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:01.442 15:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:01.442 15:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:01.442 15:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:01.702 15:50:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:01.702 [ 00:14:01.702 { 00:14:01.702 "name": "BaseBdev2", 00:14:01.702 "aliases": [ 00:14:01.702 "0dfb9a35-6813-49ae-a61a-fc00ba2613df" 00:14:01.702 ], 00:14:01.702 "product_name": "Malloc disk", 00:14:01.702 "block_size": 512, 00:14:01.702 "num_blocks": 65536, 00:14:01.702 "uuid": "0dfb9a35-6813-49ae-a61a-fc00ba2613df", 00:14:01.702 "assigned_rate_limits": { 00:14:01.702 "rw_ios_per_sec": 0, 00:14:01.702 "rw_mbytes_per_sec": 0, 00:14:01.702 "r_mbytes_per_sec": 0, 00:14:01.702 "w_mbytes_per_sec": 0 00:14:01.702 }, 00:14:01.702 "claimed": true, 00:14:01.702 "claim_type": "exclusive_write", 00:14:01.702 "zoned": false, 00:14:01.702 "supported_io_types": { 00:14:01.702 "read": true, 00:14:01.702 "write": true, 00:14:01.702 "unmap": true, 00:14:01.702 "flush": true, 00:14:01.702 "reset": true, 00:14:01.702 "nvme_admin": false, 00:14:01.702 "nvme_io": false, 00:14:01.702 "nvme_io_md": false, 00:14:01.702 "write_zeroes": true, 00:14:01.702 "zcopy": true, 00:14:01.702 "get_zone_info": false, 00:14:01.702 "zone_management": false, 00:14:01.702 "zone_append": false, 00:14:01.702 "compare": false, 00:14:01.702 "compare_and_write": false, 00:14:01.702 "abort": true, 00:14:01.702 "seek_hole": false, 00:14:01.702 "seek_data": false, 00:14:01.702 "copy": true, 00:14:01.702 "nvme_iov_md": false 00:14:01.702 }, 00:14:01.702 "memory_domains": [ 00:14:01.702 { 00:14:01.702 "dma_device_id": "system", 00:14:01.702 "dma_device_type": 1 00:14:01.702 }, 00:14:01.702 { 00:14:01.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.702 "dma_device_type": 2 00:14:01.702 } 00:14:01.702 ], 00:14:01.702 "driver_specific": {} 00:14:01.702 } 00:14:01.702 ] 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
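The state-function loop above repeats one pattern per base bdev: create the 32 MiB malloc bdev at runtime, let the raid module claim it during examine, then re-read Existed_Raid and check that the discovered count went up while the state stays "configuring" until all three members exist. A minimal sketch of the raid0 setup and one such iteration, using the same commands and jq filter as the trace (driven against the bdev_svc app listening on /var/tmp/spdk-raid.sock):

RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$RPC bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid   # no members yet: state "configuring"
$RPC bdev_malloc_create 32 512 -b BaseBdev1                                               # claimed by the raid module on examine
$RPC bdev_wait_for_examine
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'              # expect "num_base_bdevs_discovered": 1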
00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.702 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.962 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.962 "name": "Existed_Raid", 00:14:01.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.962 "strip_size_kb": 64, 00:14:01.962 "state": "configuring", 00:14:01.962 "raid_level": "raid0", 00:14:01.962 "superblock": false, 00:14:01.962 "num_base_bdevs": 3, 00:14:01.962 "num_base_bdevs_discovered": 2, 00:14:01.962 "num_base_bdevs_operational": 3, 00:14:01.962 "base_bdevs_list": [ 00:14:01.962 { 00:14:01.962 "name": "BaseBdev1", 00:14:01.962 "uuid": "7b772e60-f661-4199-8f93-7429bfcd87f1", 00:14:01.962 "is_configured": true, 00:14:01.962 "data_offset": 0, 00:14:01.962 "data_size": 65536 00:14:01.962 }, 00:14:01.962 { 00:14:01.962 "name": "BaseBdev2", 00:14:01.962 "uuid": "0dfb9a35-6813-49ae-a61a-fc00ba2613df", 00:14:01.962 "is_configured": true, 00:14:01.962 "data_offset": 0, 00:14:01.962 "data_size": 65536 00:14:01.962 }, 00:14:01.962 { 00:14:01.962 "name": "BaseBdev3", 00:14:01.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:01.962 "is_configured": false, 00:14:01.962 "data_offset": 0, 00:14:01.962 "data_size": 0 00:14:01.962 } 00:14:01.962 ] 00:14:01.962 }' 00:14:01.962 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.962 15:50:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:02.532 15:50:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:02.791 [2024-07-12 15:50:22.997084] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:02.791 [2024-07-12 15:50:22.997107] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20eb280 00:14:02.791 [2024-07-12 15:50:22.997112] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:02.791 [2024-07-12 15:50:22.997270] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20ead70 00:14:02.791 [2024-07-12 15:50:22.997360] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20eb280 00:14:02.791 [2024-07-12 15:50:22.997365] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20eb280 00:14:02.791 [2024-07-12 15:50:22.997483] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:02.791 BaseBdev3 00:14:02.791 15:50:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:02.791 15:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:02.791 15:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:02.791 15:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:02.791 15:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:02.791 15:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:02.791 15:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:02.791 15:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:03.051 [ 00:14:03.051 { 00:14:03.051 "name": "BaseBdev3", 00:14:03.051 "aliases": [ 00:14:03.051 "c2f5eede-ac28-4188-a507-e626c3e47ede" 00:14:03.051 ], 00:14:03.051 "product_name": "Malloc disk", 00:14:03.051 "block_size": 512, 00:14:03.051 "num_blocks": 65536, 00:14:03.051 "uuid": "c2f5eede-ac28-4188-a507-e626c3e47ede", 00:14:03.051 "assigned_rate_limits": { 00:14:03.051 "rw_ios_per_sec": 0, 00:14:03.051 "rw_mbytes_per_sec": 0, 00:14:03.051 "r_mbytes_per_sec": 0, 00:14:03.051 "w_mbytes_per_sec": 0 00:14:03.051 }, 00:14:03.051 "claimed": true, 00:14:03.051 "claim_type": "exclusive_write", 00:14:03.051 "zoned": false, 00:14:03.051 "supported_io_types": { 00:14:03.051 "read": true, 00:14:03.051 "write": true, 00:14:03.051 "unmap": true, 00:14:03.051 "flush": true, 00:14:03.051 "reset": true, 00:14:03.051 "nvme_admin": false, 00:14:03.051 "nvme_io": false, 00:14:03.051 "nvme_io_md": false, 00:14:03.051 "write_zeroes": true, 00:14:03.051 "zcopy": true, 00:14:03.051 "get_zone_info": false, 00:14:03.051 "zone_management": false, 00:14:03.051 "zone_append": false, 00:14:03.051 "compare": false, 00:14:03.051 "compare_and_write": false, 00:14:03.051 "abort": true, 00:14:03.051 "seek_hole": false, 00:14:03.051 "seek_data": false, 00:14:03.051 "copy": true, 00:14:03.051 "nvme_iov_md": false 00:14:03.051 }, 00:14:03.051 "memory_domains": [ 00:14:03.051 { 00:14:03.051 "dma_device_id": "system", 00:14:03.051 "dma_device_type": 1 00:14:03.051 }, 00:14:03.051 { 00:14:03.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.051 "dma_device_type": 2 00:14:03.051 } 00:14:03.051 ], 00:14:03.052 "driver_specific": {} 00:14:03.052 } 00:14:03.052 ] 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:03.052 
15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.052 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.312 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.312 "name": "Existed_Raid", 00:14:03.312 "uuid": "57d9e520-e410-48de-89a4-a24d0ae6c402", 00:14:03.312 "strip_size_kb": 64, 00:14:03.312 "state": "online", 00:14:03.312 "raid_level": "raid0", 00:14:03.312 "superblock": false, 00:14:03.312 "num_base_bdevs": 3, 00:14:03.312 "num_base_bdevs_discovered": 3, 00:14:03.312 "num_base_bdevs_operational": 3, 00:14:03.312 "base_bdevs_list": [ 00:14:03.312 { 00:14:03.312 "name": "BaseBdev1", 00:14:03.312 "uuid": "7b772e60-f661-4199-8f93-7429bfcd87f1", 00:14:03.312 "is_configured": true, 00:14:03.312 "data_offset": 0, 00:14:03.312 "data_size": 65536 00:14:03.312 }, 00:14:03.312 { 00:14:03.312 "name": "BaseBdev2", 00:14:03.312 "uuid": "0dfb9a35-6813-49ae-a61a-fc00ba2613df", 00:14:03.312 "is_configured": true, 00:14:03.312 "data_offset": 0, 00:14:03.312 "data_size": 65536 00:14:03.312 }, 00:14:03.312 { 00:14:03.312 "name": "BaseBdev3", 00:14:03.312 "uuid": "c2f5eede-ac28-4188-a507-e626c3e47ede", 00:14:03.312 "is_configured": true, 00:14:03.312 "data_offset": 0, 00:14:03.312 "data_size": 65536 00:14:03.312 } 00:14:03.312 ] 00:14:03.312 }' 00:14:03.312 15:50:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.312 15:50:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:03.882 [2024-07-12 15:50:24.300620] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:03.882 "name": "Existed_Raid", 00:14:03.882 "aliases": [ 00:14:03.882 "57d9e520-e410-48de-89a4-a24d0ae6c402" 00:14:03.882 ], 00:14:03.882 "product_name": "Raid Volume", 00:14:03.882 "block_size": 512, 00:14:03.882 "num_blocks": 196608, 00:14:03.882 "uuid": "57d9e520-e410-48de-89a4-a24d0ae6c402", 00:14:03.882 "assigned_rate_limits": { 00:14:03.882 "rw_ios_per_sec": 0, 00:14:03.882 "rw_mbytes_per_sec": 0, 00:14:03.882 "r_mbytes_per_sec": 0, 00:14:03.882 "w_mbytes_per_sec": 0 00:14:03.882 }, 00:14:03.882 "claimed": false, 00:14:03.882 "zoned": false, 00:14:03.882 "supported_io_types": { 00:14:03.882 "read": true, 00:14:03.882 "write": true, 00:14:03.882 "unmap": true, 00:14:03.882 "flush": true, 00:14:03.882 "reset": true, 00:14:03.882 "nvme_admin": false, 00:14:03.882 "nvme_io": false, 00:14:03.882 "nvme_io_md": false, 00:14:03.882 "write_zeroes": true, 00:14:03.882 "zcopy": false, 00:14:03.882 "get_zone_info": false, 00:14:03.882 "zone_management": false, 00:14:03.882 "zone_append": false, 00:14:03.882 "compare": false, 00:14:03.882 "compare_and_write": false, 00:14:03.882 "abort": false, 00:14:03.882 "seek_hole": false, 00:14:03.882 "seek_data": false, 00:14:03.882 "copy": false, 00:14:03.882 "nvme_iov_md": false 00:14:03.882 }, 00:14:03.882 "memory_domains": [ 00:14:03.882 { 00:14:03.882 "dma_device_id": "system", 00:14:03.882 "dma_device_type": 1 00:14:03.882 }, 00:14:03.882 { 00:14:03.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.882 "dma_device_type": 2 00:14:03.882 }, 00:14:03.882 { 00:14:03.882 "dma_device_id": "system", 00:14:03.882 "dma_device_type": 1 00:14:03.882 }, 00:14:03.882 { 00:14:03.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.882 "dma_device_type": 2 00:14:03.882 }, 00:14:03.882 { 00:14:03.882 "dma_device_id": "system", 00:14:03.882 "dma_device_type": 1 00:14:03.882 }, 00:14:03.882 { 00:14:03.882 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:03.882 "dma_device_type": 2 00:14:03.882 } 00:14:03.882 ], 00:14:03.882 "driver_specific": { 00:14:03.882 "raid": { 00:14:03.882 "uuid": "57d9e520-e410-48de-89a4-a24d0ae6c402", 00:14:03.882 "strip_size_kb": 64, 00:14:03.882 "state": "online", 00:14:03.882 "raid_level": "raid0", 00:14:03.882 "superblock": false, 00:14:03.882 "num_base_bdevs": 3, 00:14:03.882 "num_base_bdevs_discovered": 3, 00:14:03.882 "num_base_bdevs_operational": 3, 00:14:03.882 "base_bdevs_list": [ 00:14:03.882 { 00:14:03.882 "name": "BaseBdev1", 00:14:03.882 "uuid": "7b772e60-f661-4199-8f93-7429bfcd87f1", 00:14:03.882 "is_configured": true, 00:14:03.882 "data_offset": 0, 00:14:03.882 "data_size": 65536 00:14:03.882 }, 00:14:03.882 { 00:14:03.882 "name": "BaseBdev2", 00:14:03.882 "uuid": "0dfb9a35-6813-49ae-a61a-fc00ba2613df", 00:14:03.882 "is_configured": true, 00:14:03.882 "data_offset": 0, 00:14:03.882 "data_size": 65536 00:14:03.882 }, 00:14:03.882 { 00:14:03.882 "name": "BaseBdev3", 00:14:03.882 "uuid": "c2f5eede-ac28-4188-a507-e626c3e47ede", 00:14:03.882 "is_configured": true, 00:14:03.882 "data_offset": 0, 00:14:03.882 "data_size": 65536 00:14:03.882 } 00:14:03.882 ] 00:14:03.882 } 00:14:03.882 } 00:14:03.882 }' 00:14:03.882 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:04.142 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:04.142 BaseBdev2 00:14:04.142 BaseBdev3' 00:14:04.142 15:50:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:04.142 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:04.142 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:04.142 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:04.142 "name": "BaseBdev1", 00:14:04.142 "aliases": [ 00:14:04.142 "7b772e60-f661-4199-8f93-7429bfcd87f1" 00:14:04.142 ], 00:14:04.142 "product_name": "Malloc disk", 00:14:04.142 "block_size": 512, 00:14:04.142 "num_blocks": 65536, 00:14:04.142 "uuid": "7b772e60-f661-4199-8f93-7429bfcd87f1", 00:14:04.142 "assigned_rate_limits": { 00:14:04.142 "rw_ios_per_sec": 0, 00:14:04.142 "rw_mbytes_per_sec": 0, 00:14:04.142 "r_mbytes_per_sec": 0, 00:14:04.142 "w_mbytes_per_sec": 0 00:14:04.142 }, 00:14:04.142 "claimed": true, 00:14:04.142 "claim_type": "exclusive_write", 00:14:04.142 "zoned": false, 00:14:04.142 "supported_io_types": { 00:14:04.142 "read": true, 00:14:04.142 "write": true, 00:14:04.142 "unmap": true, 00:14:04.142 "flush": true, 00:14:04.142 "reset": true, 00:14:04.142 "nvme_admin": false, 00:14:04.142 "nvme_io": false, 00:14:04.142 "nvme_io_md": false, 00:14:04.142 "write_zeroes": true, 00:14:04.142 "zcopy": true, 00:14:04.142 "get_zone_info": false, 00:14:04.142 "zone_management": false, 00:14:04.142 "zone_append": false, 00:14:04.142 "compare": false, 00:14:04.142 "compare_and_write": false, 00:14:04.142 "abort": true, 00:14:04.142 "seek_hole": false, 00:14:04.142 "seek_data": false, 00:14:04.142 "copy": true, 00:14:04.142 "nvme_iov_md": false 00:14:04.142 }, 00:14:04.142 "memory_domains": [ 00:14:04.142 { 00:14:04.142 "dma_device_id": "system", 00:14:04.142 "dma_device_type": 1 00:14:04.142 }, 00:14:04.142 { 00:14:04.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.142 "dma_device_type": 2 00:14:04.142 } 00:14:04.142 ], 00:14:04.142 "driver_specific": {} 00:14:04.142 }' 00:14:04.142 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.142 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.402 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:04.402 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.402 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.402 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:04.402 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.402 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.402 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:04.402 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.402 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:04.661 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:04.661 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:04.661 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:04.661 15:50:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:04.661 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:04.661 "name": "BaseBdev2", 00:14:04.661 "aliases": [ 00:14:04.661 "0dfb9a35-6813-49ae-a61a-fc00ba2613df" 00:14:04.661 ], 00:14:04.661 "product_name": "Malloc disk", 00:14:04.661 "block_size": 512, 00:14:04.661 "num_blocks": 65536, 00:14:04.661 "uuid": "0dfb9a35-6813-49ae-a61a-fc00ba2613df", 00:14:04.661 "assigned_rate_limits": { 00:14:04.661 "rw_ios_per_sec": 0, 00:14:04.661 "rw_mbytes_per_sec": 0, 00:14:04.661 "r_mbytes_per_sec": 0, 00:14:04.661 "w_mbytes_per_sec": 0 00:14:04.661 }, 00:14:04.661 "claimed": true, 00:14:04.661 "claim_type": "exclusive_write", 00:14:04.661 "zoned": false, 00:14:04.661 "supported_io_types": { 00:14:04.661 "read": true, 00:14:04.661 "write": true, 00:14:04.661 "unmap": true, 00:14:04.662 "flush": true, 00:14:04.662 "reset": true, 00:14:04.662 "nvme_admin": false, 00:14:04.662 "nvme_io": false, 00:14:04.662 "nvme_io_md": false, 00:14:04.662 "write_zeroes": true, 00:14:04.662 "zcopy": true, 00:14:04.662 "get_zone_info": false, 00:14:04.662 "zone_management": false, 00:14:04.662 "zone_append": false, 00:14:04.662 "compare": false, 00:14:04.662 "compare_and_write": false, 00:14:04.662 "abort": true, 00:14:04.662 "seek_hole": false, 00:14:04.662 "seek_data": false, 00:14:04.662 "copy": true, 00:14:04.662 "nvme_iov_md": false 00:14:04.662 }, 00:14:04.662 "memory_domains": [ 00:14:04.662 { 00:14:04.662 "dma_device_id": "system", 00:14:04.662 "dma_device_type": 1 00:14:04.662 }, 00:14:04.662 { 00:14:04.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.662 "dma_device_type": 2 00:14:04.662 } 00:14:04.662 ], 00:14:04.662 "driver_specific": {} 00:14:04.662 }' 00:14:04.662 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.921 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:04.921 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:04.921 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.921 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:04.921 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:04.921 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.921 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:04.921 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:04.921 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:05.181 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:05.181 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:05.181 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:05.181 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:05.181 15:50:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:05.441 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:05.441 "name": "BaseBdev3", 00:14:05.441 "aliases": [ 00:14:05.441 "c2f5eede-ac28-4188-a507-e626c3e47ede" 00:14:05.441 ], 00:14:05.441 "product_name": "Malloc disk", 00:14:05.441 "block_size": 512, 00:14:05.441 "num_blocks": 65536, 00:14:05.441 "uuid": "c2f5eede-ac28-4188-a507-e626c3e47ede", 00:14:05.441 "assigned_rate_limits": { 00:14:05.441 "rw_ios_per_sec": 0, 00:14:05.441 "rw_mbytes_per_sec": 0, 00:14:05.441 "r_mbytes_per_sec": 0, 00:14:05.441 "w_mbytes_per_sec": 0 00:14:05.441 }, 00:14:05.441 "claimed": true, 00:14:05.441 "claim_type": "exclusive_write", 00:14:05.441 "zoned": false, 00:14:05.441 "supported_io_types": { 00:14:05.441 "read": true, 00:14:05.441 "write": true, 00:14:05.441 "unmap": true, 00:14:05.441 "flush": true, 00:14:05.441 "reset": true, 00:14:05.441 "nvme_admin": false, 00:14:05.441 "nvme_io": false, 00:14:05.441 "nvme_io_md": false, 00:14:05.441 "write_zeroes": true, 00:14:05.441 "zcopy": true, 00:14:05.441 "get_zone_info": false, 00:14:05.441 "zone_management": false, 00:14:05.441 "zone_append": false, 00:14:05.441 "compare": false, 00:14:05.441 "compare_and_write": false, 00:14:05.441 "abort": true, 00:14:05.441 "seek_hole": false, 00:14:05.441 "seek_data": false, 00:14:05.441 "copy": true, 00:14:05.441 "nvme_iov_md": false 00:14:05.441 }, 00:14:05.441 "memory_domains": [ 00:14:05.441 { 00:14:05.441 "dma_device_id": "system", 00:14:05.441 "dma_device_type": 1 00:14:05.441 }, 00:14:05.441 { 00:14:05.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.441 "dma_device_type": 2 00:14:05.441 } 00:14:05.441 ], 00:14:05.441 "driver_specific": {} 00:14:05.441 }' 00:14:05.441 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.441 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:05.441 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:05.441 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:05.441 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:05.441 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:05.441 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:05.441 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:05.700 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:05.700 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:05.700 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:05.700 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:05.700 15:50:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:05.959 [2024-07-12 15:50:26.157107] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:05.959 [2024-07-12 15:50:26.157124] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:05.959 [2024-07-12 15:50:26.157155] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.959 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.960 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.960 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.960 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.960 "name": "Existed_Raid", 00:14:05.960 "uuid": "57d9e520-e410-48de-89a4-a24d0ae6c402", 00:14:05.960 "strip_size_kb": 64, 00:14:05.960 "state": "offline", 00:14:05.960 "raid_level": "raid0", 00:14:05.960 "superblock": false, 00:14:05.960 "num_base_bdevs": 3, 00:14:05.960 "num_base_bdevs_discovered": 2, 00:14:05.960 "num_base_bdevs_operational": 2, 00:14:05.960 "base_bdevs_list": [ 00:14:05.960 { 00:14:05.960 "name": null, 00:14:05.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:05.960 "is_configured": false, 00:14:05.960 "data_offset": 0, 00:14:05.960 "data_size": 65536 00:14:05.960 }, 00:14:05.960 { 00:14:05.960 "name": "BaseBdev2", 00:14:05.960 "uuid": "0dfb9a35-6813-49ae-a61a-fc00ba2613df", 00:14:05.960 "is_configured": true, 00:14:05.960 "data_offset": 0, 00:14:05.960 "data_size": 65536 00:14:05.960 }, 00:14:05.960 { 00:14:05.960 "name": "BaseBdev3", 00:14:05.960 "uuid": "c2f5eede-ac28-4188-a507-e626c3e47ede", 00:14:05.960 "is_configured": true, 00:14:05.960 "data_offset": 0, 00:14:05.960 "data_size": 65536 00:14:05.960 } 00:14:05.960 ] 00:14:05.960 }' 00:14:05.960 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.960 15:50:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:06.529 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:06.529 15:50:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:06.529 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.529 15:50:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:06.788 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:06.788 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:06.788 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:07.048 [2024-07-12 15:50:27.279948] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:07.048 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:07.048 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:07.048 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:07.048 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.048 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:07.048 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:07.048 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:07.307 [2024-07-12 15:50:27.666702] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:07.307 [2024-07-12 15:50:27.666734] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20eb280 name Existed_Raid, state offline 00:14:07.307 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:07.307 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:07.307 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.307 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:07.566 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:07.566 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:07.566 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:07.566 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:07.566 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:07.566 15:50:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:07.825 BaseBdev2 00:14:07.825 15:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 
00:14:07.825 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:07.825 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:07.825 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:07.825 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:07.825 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:07.825 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:07.825 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:08.084 [ 00:14:08.084 { 00:14:08.085 "name": "BaseBdev2", 00:14:08.085 "aliases": [ 00:14:08.085 "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb" 00:14:08.085 ], 00:14:08.085 "product_name": "Malloc disk", 00:14:08.085 "block_size": 512, 00:14:08.085 "num_blocks": 65536, 00:14:08.085 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:08.085 "assigned_rate_limits": { 00:14:08.085 "rw_ios_per_sec": 0, 00:14:08.085 "rw_mbytes_per_sec": 0, 00:14:08.085 "r_mbytes_per_sec": 0, 00:14:08.085 "w_mbytes_per_sec": 0 00:14:08.085 }, 00:14:08.085 "claimed": false, 00:14:08.085 "zoned": false, 00:14:08.085 "supported_io_types": { 00:14:08.085 "read": true, 00:14:08.085 "write": true, 00:14:08.085 "unmap": true, 00:14:08.085 "flush": true, 00:14:08.085 "reset": true, 00:14:08.085 "nvme_admin": false, 00:14:08.085 "nvme_io": false, 00:14:08.085 "nvme_io_md": false, 00:14:08.085 "write_zeroes": true, 00:14:08.085 "zcopy": true, 00:14:08.085 "get_zone_info": false, 00:14:08.085 "zone_management": false, 00:14:08.085 "zone_append": false, 00:14:08.085 "compare": false, 00:14:08.085 "compare_and_write": false, 00:14:08.085 "abort": true, 00:14:08.085 "seek_hole": false, 00:14:08.085 "seek_data": false, 00:14:08.085 "copy": true, 00:14:08.085 "nvme_iov_md": false 00:14:08.085 }, 00:14:08.085 "memory_domains": [ 00:14:08.085 { 00:14:08.085 "dma_device_id": "system", 00:14:08.085 "dma_device_type": 1 00:14:08.085 }, 00:14:08.085 { 00:14:08.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.085 "dma_device_type": 2 00:14:08.085 } 00:14:08.085 ], 00:14:08.085 "driver_specific": {} 00:14:08.085 } 00:14:08.085 ] 00:14:08.085 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:08.085 15:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:08.085 15:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:08.085 15:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:08.344 BaseBdev3 00:14:08.344 15:50:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:08.344 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:08.344 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:08.344 15:50:28 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:08.344 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:08.344 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:08.344 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:08.604 15:50:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:08.604 [ 00:14:08.604 { 00:14:08.604 "name": "BaseBdev3", 00:14:08.604 "aliases": [ 00:14:08.604 "450e393d-3e7c-4c67-a9af-e64aa628e5c3" 00:14:08.604 ], 00:14:08.604 "product_name": "Malloc disk", 00:14:08.604 "block_size": 512, 00:14:08.604 "num_blocks": 65536, 00:14:08.604 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:08.604 "assigned_rate_limits": { 00:14:08.604 "rw_ios_per_sec": 0, 00:14:08.604 "rw_mbytes_per_sec": 0, 00:14:08.604 "r_mbytes_per_sec": 0, 00:14:08.604 "w_mbytes_per_sec": 0 00:14:08.604 }, 00:14:08.604 "claimed": false, 00:14:08.604 "zoned": false, 00:14:08.604 "supported_io_types": { 00:14:08.604 "read": true, 00:14:08.604 "write": true, 00:14:08.604 "unmap": true, 00:14:08.604 "flush": true, 00:14:08.604 "reset": true, 00:14:08.604 "nvme_admin": false, 00:14:08.604 "nvme_io": false, 00:14:08.604 "nvme_io_md": false, 00:14:08.604 "write_zeroes": true, 00:14:08.604 "zcopy": true, 00:14:08.604 "get_zone_info": false, 00:14:08.604 "zone_management": false, 00:14:08.604 "zone_append": false, 00:14:08.604 "compare": false, 00:14:08.604 "compare_and_write": false, 00:14:08.604 "abort": true, 00:14:08.604 "seek_hole": false, 00:14:08.604 "seek_data": false, 00:14:08.604 "copy": true, 00:14:08.604 "nvme_iov_md": false 00:14:08.604 }, 00:14:08.604 "memory_domains": [ 00:14:08.604 { 00:14:08.604 "dma_device_id": "system", 00:14:08.604 "dma_device_type": 1 00:14:08.604 }, 00:14:08.604 { 00:14:08.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:08.604 "dma_device_type": 2 00:14:08.604 } 00:14:08.604 ], 00:14:08.604 "driver_specific": {} 00:14:08.604 } 00:14:08.604 ] 00:14:08.604 15:50:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:08.604 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:08.604 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:08.604 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:08.864 [2024-07-12 15:50:29.182397] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:08.864 [2024-07-12 15:50:29.182424] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:08.864 [2024-07-12 15:50:29.182436] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:08.864 [2024-07-12 15:50:29.183565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 3 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:08.864 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.157 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.157 "name": "Existed_Raid", 00:14:09.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.157 "strip_size_kb": 64, 00:14:09.157 "state": "configuring", 00:14:09.157 "raid_level": "raid0", 00:14:09.157 "superblock": false, 00:14:09.157 "num_base_bdevs": 3, 00:14:09.157 "num_base_bdevs_discovered": 2, 00:14:09.157 "num_base_bdevs_operational": 3, 00:14:09.157 "base_bdevs_list": [ 00:14:09.157 { 00:14:09.157 "name": "BaseBdev1", 00:14:09.157 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:09.157 "is_configured": false, 00:14:09.157 "data_offset": 0, 00:14:09.157 "data_size": 0 00:14:09.157 }, 00:14:09.157 { 00:14:09.157 "name": "BaseBdev2", 00:14:09.157 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:09.157 "is_configured": true, 00:14:09.157 "data_offset": 0, 00:14:09.157 "data_size": 65536 00:14:09.157 }, 00:14:09.157 { 00:14:09.157 "name": "BaseBdev3", 00:14:09.157 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:09.157 "is_configured": true, 00:14:09.157 "data_offset": 0, 00:14:09.157 "data_size": 65536 00:14:09.158 } 00:14:09.158 ] 00:14:09.158 }' 00:14:09.158 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.158 15:50:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.755 15:50:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:09.755 [2024-07-12 15:50:30.116747] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.755 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.014 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.014 "name": "Existed_Raid", 00:14:10.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.014 "strip_size_kb": 64, 00:14:10.014 "state": "configuring", 00:14:10.014 "raid_level": "raid0", 00:14:10.014 "superblock": false, 00:14:10.014 "num_base_bdevs": 3, 00:14:10.014 "num_base_bdevs_discovered": 1, 00:14:10.014 "num_base_bdevs_operational": 3, 00:14:10.014 "base_bdevs_list": [ 00:14:10.014 { 00:14:10.014 "name": "BaseBdev1", 00:14:10.014 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:10.014 "is_configured": false, 00:14:10.014 "data_offset": 0, 00:14:10.014 "data_size": 0 00:14:10.014 }, 00:14:10.014 { 00:14:10.014 "name": null, 00:14:10.014 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:10.014 "is_configured": false, 00:14:10.014 "data_offset": 0, 00:14:10.014 "data_size": 65536 00:14:10.014 }, 00:14:10.014 { 00:14:10.014 "name": "BaseBdev3", 00:14:10.014 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:10.014 "is_configured": true, 00:14:10.014 "data_offset": 0, 00:14:10.014 "data_size": 65536 00:14:10.014 } 00:14:10.014 ] 00:14:10.014 }' 00:14:10.014 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.014 15:50:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:10.581 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.581 15:50:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:10.841 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:10.841 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:10.841 [2024-07-12 15:50:31.220374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:10.841 BaseBdev1 00:14:10.841 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:10.841 15:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:10.841 15:50:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:10.841 15:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:10.841 15:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:10.841 15:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:10.841 15:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:11.411 15:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:11.671 [ 00:14:11.671 { 00:14:11.671 "name": "BaseBdev1", 00:14:11.671 "aliases": [ 00:14:11.671 "c056908d-2e36-4fb1-a15c-7f719271e5ca" 00:14:11.671 ], 00:14:11.671 "product_name": "Malloc disk", 00:14:11.671 "block_size": 512, 00:14:11.671 "num_blocks": 65536, 00:14:11.671 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:11.671 "assigned_rate_limits": { 00:14:11.671 "rw_ios_per_sec": 0, 00:14:11.671 "rw_mbytes_per_sec": 0, 00:14:11.671 "r_mbytes_per_sec": 0, 00:14:11.671 "w_mbytes_per_sec": 0 00:14:11.671 }, 00:14:11.671 "claimed": true, 00:14:11.671 "claim_type": "exclusive_write", 00:14:11.671 "zoned": false, 00:14:11.671 "supported_io_types": { 00:14:11.671 "read": true, 00:14:11.671 "write": true, 00:14:11.671 "unmap": true, 00:14:11.671 "flush": true, 00:14:11.671 "reset": true, 00:14:11.671 "nvme_admin": false, 00:14:11.671 "nvme_io": false, 00:14:11.671 "nvme_io_md": false, 00:14:11.671 "write_zeroes": true, 00:14:11.671 "zcopy": true, 00:14:11.671 "get_zone_info": false, 00:14:11.671 "zone_management": false, 00:14:11.671 "zone_append": false, 00:14:11.671 "compare": false, 00:14:11.671 "compare_and_write": false, 00:14:11.671 "abort": true, 00:14:11.671 "seek_hole": false, 00:14:11.671 "seek_data": false, 00:14:11.671 "copy": true, 00:14:11.671 "nvme_iov_md": false 00:14:11.671 }, 00:14:11.671 "memory_domains": [ 00:14:11.671 { 00:14:11.671 "dma_device_id": "system", 00:14:11.671 "dma_device_type": 1 00:14:11.671 }, 00:14:11.671 { 00:14:11.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.671 "dma_device_type": 2 00:14:11.671 } 00:14:11.671 ], 00:14:11.671 "driver_specific": {} 00:14:11.671 } 00:14:11.671 ] 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.671 15:50:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.671 15:50:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:11.931 15:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.931 "name": "Existed_Raid", 00:14:11.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:11.932 "strip_size_kb": 64, 00:14:11.932 "state": "configuring", 00:14:11.932 "raid_level": "raid0", 00:14:11.932 "superblock": false, 00:14:11.932 "num_base_bdevs": 3, 00:14:11.932 "num_base_bdevs_discovered": 2, 00:14:11.932 "num_base_bdevs_operational": 3, 00:14:11.932 "base_bdevs_list": [ 00:14:11.932 { 00:14:11.932 "name": "BaseBdev1", 00:14:11.932 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:11.932 "is_configured": true, 00:14:11.932 "data_offset": 0, 00:14:11.932 "data_size": 65536 00:14:11.932 }, 00:14:11.932 { 00:14:11.932 "name": null, 00:14:11.932 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:11.932 "is_configured": false, 00:14:11.932 "data_offset": 0, 00:14:11.932 "data_size": 65536 00:14:11.932 }, 00:14:11.932 { 00:14:11.932 "name": "BaseBdev3", 00:14:11.932 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:11.932 "is_configured": true, 00:14:11.932 "data_offset": 0, 00:14:11.932 "data_size": 65536 00:14:11.932 } 00:14:11.932 ] 00:14:11.932 }' 00:14:11.932 15:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.932 15:50:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.501 15:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.501 15:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:12.501 15:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:12.501 15:50:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:12.760 [2024-07-12 15:50:33.077142] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:12.760 15:50:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.760 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.020 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.020 "name": "Existed_Raid", 00:14:13.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:13.020 "strip_size_kb": 64, 00:14:13.020 "state": "configuring", 00:14:13.020 "raid_level": "raid0", 00:14:13.020 "superblock": false, 00:14:13.020 "num_base_bdevs": 3, 00:14:13.020 "num_base_bdevs_discovered": 1, 00:14:13.020 "num_base_bdevs_operational": 3, 00:14:13.020 "base_bdevs_list": [ 00:14:13.020 { 00:14:13.020 "name": "BaseBdev1", 00:14:13.020 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:13.020 "is_configured": true, 00:14:13.020 "data_offset": 0, 00:14:13.020 "data_size": 65536 00:14:13.020 }, 00:14:13.020 { 00:14:13.020 "name": null, 00:14:13.020 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:13.020 "is_configured": false, 00:14:13.020 "data_offset": 0, 00:14:13.020 "data_size": 65536 00:14:13.020 }, 00:14:13.020 { 00:14:13.020 "name": null, 00:14:13.020 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:13.020 "is_configured": false, 00:14:13.020 "data_offset": 0, 00:14:13.020 "data_size": 65536 00:14:13.020 } 00:14:13.020 ] 00:14:13.020 }' 00:14:13.020 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.020 15:50:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:13.592 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.592 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:13.592 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:13.592 15:50:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:13.852 [2024-07-12 15:50:34.171933] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:13.852 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:13.852 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:13.852 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:13.852 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:13.852 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.852 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:14:13.853 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.853 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.853 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.853 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.853 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.853 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:14.113 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:14.113 "name": "Existed_Raid", 00:14:14.113 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:14.113 "strip_size_kb": 64, 00:14:14.113 "state": "configuring", 00:14:14.113 "raid_level": "raid0", 00:14:14.113 "superblock": false, 00:14:14.113 "num_base_bdevs": 3, 00:14:14.113 "num_base_bdevs_discovered": 2, 00:14:14.113 "num_base_bdevs_operational": 3, 00:14:14.113 "base_bdevs_list": [ 00:14:14.113 { 00:14:14.113 "name": "BaseBdev1", 00:14:14.113 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:14.113 "is_configured": true, 00:14:14.113 "data_offset": 0, 00:14:14.113 "data_size": 65536 00:14:14.113 }, 00:14:14.113 { 00:14:14.113 "name": null, 00:14:14.113 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:14.113 "is_configured": false, 00:14:14.113 "data_offset": 0, 00:14:14.113 "data_size": 65536 00:14:14.113 }, 00:14:14.113 { 00:14:14.113 "name": "BaseBdev3", 00:14:14.113 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:14.113 "is_configured": true, 00:14:14.113 "data_offset": 0, 00:14:14.113 "data_size": 65536 00:14:14.113 } 00:14:14.113 ] 00:14:14.113 }' 00:14:14.113 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:14.113 15:50:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:14.683 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.683 15:50:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:14.683 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:14.683 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:15.252 [2024-07-12 15:50:35.631698] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:15.252 15:50:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.252 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:15.512 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.512 "name": "Existed_Raid", 00:14:15.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:15.512 "strip_size_kb": 64, 00:14:15.512 "state": "configuring", 00:14:15.512 "raid_level": "raid0", 00:14:15.512 "superblock": false, 00:14:15.512 "num_base_bdevs": 3, 00:14:15.512 "num_base_bdevs_discovered": 1, 00:14:15.512 "num_base_bdevs_operational": 3, 00:14:15.512 "base_bdevs_list": [ 00:14:15.512 { 00:14:15.512 "name": null, 00:14:15.512 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:15.512 "is_configured": false, 00:14:15.512 "data_offset": 0, 00:14:15.512 "data_size": 65536 00:14:15.512 }, 00:14:15.512 { 00:14:15.512 "name": null, 00:14:15.512 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:15.512 "is_configured": false, 00:14:15.512 "data_offset": 0, 00:14:15.512 "data_size": 65536 00:14:15.512 }, 00:14:15.512 { 00:14:15.512 "name": "BaseBdev3", 00:14:15.512 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:15.512 "is_configured": true, 00:14:15.512 "data_offset": 0, 00:14:15.512 "data_size": 65536 00:14:15.512 } 00:14:15.512 ] 00:14:15.512 }' 00:14:15.512 15:50:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.512 15:50:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:16.081 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.081 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:16.340 [2024-07-12 15:50:36.748326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.340 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:16.601 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.601 "name": "Existed_Raid", 00:14:16.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:16.601 "strip_size_kb": 64, 00:14:16.601 "state": "configuring", 00:14:16.601 "raid_level": "raid0", 00:14:16.601 "superblock": false, 00:14:16.601 "num_base_bdevs": 3, 00:14:16.601 "num_base_bdevs_discovered": 2, 00:14:16.601 "num_base_bdevs_operational": 3, 00:14:16.601 "base_bdevs_list": [ 00:14:16.601 { 00:14:16.601 "name": null, 00:14:16.601 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:16.601 "is_configured": false, 00:14:16.601 "data_offset": 0, 00:14:16.601 "data_size": 65536 00:14:16.601 }, 00:14:16.601 { 00:14:16.601 "name": "BaseBdev2", 00:14:16.601 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:16.601 "is_configured": true, 00:14:16.601 "data_offset": 0, 00:14:16.601 "data_size": 65536 00:14:16.601 }, 00:14:16.601 { 00:14:16.601 "name": "BaseBdev3", 00:14:16.601 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:16.601 "is_configured": true, 00:14:16.601 "data_offset": 0, 00:14:16.601 "data_size": 65536 00:14:16.601 } 00:14:16.601 ] 00:14:16.601 }' 00:14:16.601 15:50:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.601 15:50:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.170 15:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.170 15:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:17.429 15:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:17.429 15:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.430 15:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:17.690 15:50:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c056908d-2e36-4fb1-a15c-7f719271e5ca 00:14:17.690 [2024-07-12 15:50:38.116642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 
00:14:17.690 [2024-07-12 15:50:38.116666] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20ef9d0 00:14:17.690 [2024-07-12 15:50:38.116671] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:17.690 [2024-07-12 15:50:38.116821] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1de9290 00:14:17.690 [2024-07-12 15:50:38.116910] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20ef9d0 00:14:17.690 [2024-07-12 15:50:38.116916] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20ef9d0 00:14:17.690 [2024-07-12 15:50:38.117036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:17.690 NewBaseBdev 00:14:17.690 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:17.690 15:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:17.690 15:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:17.690 15:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:17.690 15:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:17.690 15:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:17.690 15:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:17.950 15:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:18.210 [ 00:14:18.210 { 00:14:18.210 "name": "NewBaseBdev", 00:14:18.210 "aliases": [ 00:14:18.210 "c056908d-2e36-4fb1-a15c-7f719271e5ca" 00:14:18.210 ], 00:14:18.210 "product_name": "Malloc disk", 00:14:18.210 "block_size": 512, 00:14:18.210 "num_blocks": 65536, 00:14:18.210 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:18.210 "assigned_rate_limits": { 00:14:18.210 "rw_ios_per_sec": 0, 00:14:18.210 "rw_mbytes_per_sec": 0, 00:14:18.210 "r_mbytes_per_sec": 0, 00:14:18.210 "w_mbytes_per_sec": 0 00:14:18.210 }, 00:14:18.210 "claimed": true, 00:14:18.210 "claim_type": "exclusive_write", 00:14:18.210 "zoned": false, 00:14:18.210 "supported_io_types": { 00:14:18.210 "read": true, 00:14:18.210 "write": true, 00:14:18.210 "unmap": true, 00:14:18.210 "flush": true, 00:14:18.210 "reset": true, 00:14:18.210 "nvme_admin": false, 00:14:18.210 "nvme_io": false, 00:14:18.210 "nvme_io_md": false, 00:14:18.210 "write_zeroes": true, 00:14:18.210 "zcopy": true, 00:14:18.210 "get_zone_info": false, 00:14:18.210 "zone_management": false, 00:14:18.210 "zone_append": false, 00:14:18.210 "compare": false, 00:14:18.210 "compare_and_write": false, 00:14:18.210 "abort": true, 00:14:18.210 "seek_hole": false, 00:14:18.210 "seek_data": false, 00:14:18.210 "copy": true, 00:14:18.210 "nvme_iov_md": false 00:14:18.210 }, 00:14:18.210 "memory_domains": [ 00:14:18.210 { 00:14:18.210 "dma_device_id": "system", 00:14:18.210 "dma_device_type": 1 00:14:18.210 }, 00:14:18.210 { 00:14:18.210 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.210 "dma_device_type": 2 00:14:18.210 } 00:14:18.210 ], 00:14:18.210 "driver_specific": {} 00:14:18.210 } 00:14:18.210 ] 
00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:18.210 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:18.470 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:18.470 "name": "Existed_Raid", 00:14:18.470 "uuid": "8f97257b-4f8a-460b-885f-a802dfe1acac", 00:14:18.470 "strip_size_kb": 64, 00:14:18.470 "state": "online", 00:14:18.470 "raid_level": "raid0", 00:14:18.470 "superblock": false, 00:14:18.470 "num_base_bdevs": 3, 00:14:18.470 "num_base_bdevs_discovered": 3, 00:14:18.470 "num_base_bdevs_operational": 3, 00:14:18.470 "base_bdevs_list": [ 00:14:18.470 { 00:14:18.470 "name": "NewBaseBdev", 00:14:18.470 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:18.470 "is_configured": true, 00:14:18.470 "data_offset": 0, 00:14:18.470 "data_size": 65536 00:14:18.470 }, 00:14:18.470 { 00:14:18.470 "name": "BaseBdev2", 00:14:18.470 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:18.470 "is_configured": true, 00:14:18.470 "data_offset": 0, 00:14:18.470 "data_size": 65536 00:14:18.470 }, 00:14:18.470 { 00:14:18.470 "name": "BaseBdev3", 00:14:18.470 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:18.470 "is_configured": true, 00:14:18.470 "data_offset": 0, 00:14:18.470 "data_size": 65536 00:14:18.470 } 00:14:18.470 ] 00:14:18.470 }' 00:14:18.470 15:50:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:18.470 15:50:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:19.040 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:19.040 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:19.040 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:19.040 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:19.040 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:14:19.040 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:19.040 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:19.040 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:19.040 [2024-07-12 15:50:39.404137] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:19.040 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:19.040 "name": "Existed_Raid", 00:14:19.040 "aliases": [ 00:14:19.040 "8f97257b-4f8a-460b-885f-a802dfe1acac" 00:14:19.040 ], 00:14:19.040 "product_name": "Raid Volume", 00:14:19.040 "block_size": 512, 00:14:19.040 "num_blocks": 196608, 00:14:19.040 "uuid": "8f97257b-4f8a-460b-885f-a802dfe1acac", 00:14:19.040 "assigned_rate_limits": { 00:14:19.040 "rw_ios_per_sec": 0, 00:14:19.040 "rw_mbytes_per_sec": 0, 00:14:19.040 "r_mbytes_per_sec": 0, 00:14:19.040 "w_mbytes_per_sec": 0 00:14:19.040 }, 00:14:19.040 "claimed": false, 00:14:19.040 "zoned": false, 00:14:19.040 "supported_io_types": { 00:14:19.040 "read": true, 00:14:19.040 "write": true, 00:14:19.041 "unmap": true, 00:14:19.041 "flush": true, 00:14:19.041 "reset": true, 00:14:19.041 "nvme_admin": false, 00:14:19.041 "nvme_io": false, 00:14:19.041 "nvme_io_md": false, 00:14:19.041 "write_zeroes": true, 00:14:19.041 "zcopy": false, 00:14:19.041 "get_zone_info": false, 00:14:19.041 "zone_management": false, 00:14:19.041 "zone_append": false, 00:14:19.041 "compare": false, 00:14:19.041 "compare_and_write": false, 00:14:19.041 "abort": false, 00:14:19.041 "seek_hole": false, 00:14:19.041 "seek_data": false, 00:14:19.041 "copy": false, 00:14:19.041 "nvme_iov_md": false 00:14:19.041 }, 00:14:19.041 "memory_domains": [ 00:14:19.041 { 00:14:19.041 "dma_device_id": "system", 00:14:19.041 "dma_device_type": 1 00:14:19.041 }, 00:14:19.041 { 00:14:19.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.041 "dma_device_type": 2 00:14:19.041 }, 00:14:19.041 { 00:14:19.041 "dma_device_id": "system", 00:14:19.041 "dma_device_type": 1 00:14:19.041 }, 00:14:19.041 { 00:14:19.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.041 "dma_device_type": 2 00:14:19.041 }, 00:14:19.041 { 00:14:19.041 "dma_device_id": "system", 00:14:19.041 "dma_device_type": 1 00:14:19.041 }, 00:14:19.041 { 00:14:19.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.041 "dma_device_type": 2 00:14:19.041 } 00:14:19.041 ], 00:14:19.041 "driver_specific": { 00:14:19.041 "raid": { 00:14:19.041 "uuid": "8f97257b-4f8a-460b-885f-a802dfe1acac", 00:14:19.041 "strip_size_kb": 64, 00:14:19.041 "state": "online", 00:14:19.041 "raid_level": "raid0", 00:14:19.041 "superblock": false, 00:14:19.041 "num_base_bdevs": 3, 00:14:19.041 "num_base_bdevs_discovered": 3, 00:14:19.041 "num_base_bdevs_operational": 3, 00:14:19.041 "base_bdevs_list": [ 00:14:19.041 { 00:14:19.041 "name": "NewBaseBdev", 00:14:19.041 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:19.041 "is_configured": true, 00:14:19.041 "data_offset": 0, 00:14:19.041 "data_size": 65536 00:14:19.041 }, 00:14:19.041 { 00:14:19.041 "name": "BaseBdev2", 00:14:19.041 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:19.041 "is_configured": true, 00:14:19.041 "data_offset": 0, 00:14:19.041 "data_size": 65536 00:14:19.041 }, 00:14:19.041 { 00:14:19.041 "name": "BaseBdev3", 00:14:19.041 
"uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:19.041 "is_configured": true, 00:14:19.041 "data_offset": 0, 00:14:19.041 "data_size": 65536 00:14:19.041 } 00:14:19.041 ] 00:14:19.041 } 00:14:19.041 } 00:14:19.041 }' 00:14:19.041 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:19.041 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:19.041 BaseBdev2 00:14:19.041 BaseBdev3' 00:14:19.041 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:19.041 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:19.041 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:19.301 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:19.301 "name": "NewBaseBdev", 00:14:19.301 "aliases": [ 00:14:19.301 "c056908d-2e36-4fb1-a15c-7f719271e5ca" 00:14:19.301 ], 00:14:19.301 "product_name": "Malloc disk", 00:14:19.301 "block_size": 512, 00:14:19.301 "num_blocks": 65536, 00:14:19.301 "uuid": "c056908d-2e36-4fb1-a15c-7f719271e5ca", 00:14:19.301 "assigned_rate_limits": { 00:14:19.301 "rw_ios_per_sec": 0, 00:14:19.301 "rw_mbytes_per_sec": 0, 00:14:19.301 "r_mbytes_per_sec": 0, 00:14:19.301 "w_mbytes_per_sec": 0 00:14:19.301 }, 00:14:19.301 "claimed": true, 00:14:19.301 "claim_type": "exclusive_write", 00:14:19.301 "zoned": false, 00:14:19.301 "supported_io_types": { 00:14:19.301 "read": true, 00:14:19.301 "write": true, 00:14:19.301 "unmap": true, 00:14:19.301 "flush": true, 00:14:19.301 "reset": true, 00:14:19.301 "nvme_admin": false, 00:14:19.301 "nvme_io": false, 00:14:19.301 "nvme_io_md": false, 00:14:19.301 "write_zeroes": true, 00:14:19.301 "zcopy": true, 00:14:19.301 "get_zone_info": false, 00:14:19.301 "zone_management": false, 00:14:19.301 "zone_append": false, 00:14:19.301 "compare": false, 00:14:19.301 "compare_and_write": false, 00:14:19.301 "abort": true, 00:14:19.301 "seek_hole": false, 00:14:19.301 "seek_data": false, 00:14:19.301 "copy": true, 00:14:19.301 "nvme_iov_md": false 00:14:19.301 }, 00:14:19.301 "memory_domains": [ 00:14:19.301 { 00:14:19.301 "dma_device_id": "system", 00:14:19.301 "dma_device_type": 1 00:14:19.301 }, 00:14:19.301 { 00:14:19.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.301 "dma_device_type": 2 00:14:19.301 } 00:14:19.301 ], 00:14:19.301 "driver_specific": {} 00:14:19.301 }' 00:14:19.301 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.301 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.560 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:19.560 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.560 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.560 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:19.560 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.560 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.561 15:50:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:19.561 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.561 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.561 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:19.561 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:19.561 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:19.561 15:50:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:19.820 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:19.820 "name": "BaseBdev2", 00:14:19.820 "aliases": [ 00:14:19.820 "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb" 00:14:19.820 ], 00:14:19.820 "product_name": "Malloc disk", 00:14:19.820 "block_size": 512, 00:14:19.820 "num_blocks": 65536, 00:14:19.820 "uuid": "26d23a1d-28c0-4aaf-b0b1-2dabbee21ceb", 00:14:19.820 "assigned_rate_limits": { 00:14:19.820 "rw_ios_per_sec": 0, 00:14:19.820 "rw_mbytes_per_sec": 0, 00:14:19.820 "r_mbytes_per_sec": 0, 00:14:19.820 "w_mbytes_per_sec": 0 00:14:19.820 }, 00:14:19.820 "claimed": true, 00:14:19.821 "claim_type": "exclusive_write", 00:14:19.821 "zoned": false, 00:14:19.821 "supported_io_types": { 00:14:19.821 "read": true, 00:14:19.821 "write": true, 00:14:19.821 "unmap": true, 00:14:19.821 "flush": true, 00:14:19.821 "reset": true, 00:14:19.821 "nvme_admin": false, 00:14:19.821 "nvme_io": false, 00:14:19.821 "nvme_io_md": false, 00:14:19.821 "write_zeroes": true, 00:14:19.821 "zcopy": true, 00:14:19.821 "get_zone_info": false, 00:14:19.821 "zone_management": false, 00:14:19.821 "zone_append": false, 00:14:19.821 "compare": false, 00:14:19.821 "compare_and_write": false, 00:14:19.821 "abort": true, 00:14:19.821 "seek_hole": false, 00:14:19.821 "seek_data": false, 00:14:19.821 "copy": true, 00:14:19.821 "nvme_iov_md": false 00:14:19.821 }, 00:14:19.821 "memory_domains": [ 00:14:19.821 { 00:14:19.821 "dma_device_id": "system", 00:14:19.821 "dma_device_type": 1 00:14:19.821 }, 00:14:19.821 { 00:14:19.821 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.821 "dma_device_type": 2 00:14:19.821 } 00:14:19.821 ], 00:14:19.821 "driver_specific": {} 00:14:19.821 }' 00:14:19.821 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.821 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.821 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:19.821 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:20.080 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:20.340 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:20.340 "name": "BaseBdev3", 00:14:20.340 "aliases": [ 00:14:20.340 "450e393d-3e7c-4c67-a9af-e64aa628e5c3" 00:14:20.340 ], 00:14:20.340 "product_name": "Malloc disk", 00:14:20.340 "block_size": 512, 00:14:20.340 "num_blocks": 65536, 00:14:20.340 "uuid": "450e393d-3e7c-4c67-a9af-e64aa628e5c3", 00:14:20.340 "assigned_rate_limits": { 00:14:20.340 "rw_ios_per_sec": 0, 00:14:20.340 "rw_mbytes_per_sec": 0, 00:14:20.340 "r_mbytes_per_sec": 0, 00:14:20.340 "w_mbytes_per_sec": 0 00:14:20.340 }, 00:14:20.340 "claimed": true, 00:14:20.340 "claim_type": "exclusive_write", 00:14:20.340 "zoned": false, 00:14:20.340 "supported_io_types": { 00:14:20.340 "read": true, 00:14:20.340 "write": true, 00:14:20.340 "unmap": true, 00:14:20.340 "flush": true, 00:14:20.341 "reset": true, 00:14:20.341 "nvme_admin": false, 00:14:20.341 "nvme_io": false, 00:14:20.341 "nvme_io_md": false, 00:14:20.341 "write_zeroes": true, 00:14:20.341 "zcopy": true, 00:14:20.341 "get_zone_info": false, 00:14:20.341 "zone_management": false, 00:14:20.341 "zone_append": false, 00:14:20.341 "compare": false, 00:14:20.341 "compare_and_write": false, 00:14:20.341 "abort": true, 00:14:20.341 "seek_hole": false, 00:14:20.341 "seek_data": false, 00:14:20.341 "copy": true, 00:14:20.341 "nvme_iov_md": false 00:14:20.341 }, 00:14:20.341 "memory_domains": [ 00:14:20.341 { 00:14:20.341 "dma_device_id": "system", 00:14:20.341 "dma_device_type": 1 00:14:20.341 }, 00:14:20.341 { 00:14:20.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:20.341 "dma_device_type": 2 00:14:20.341 } 00:14:20.341 ], 00:14:20.341 "driver_specific": {} 00:14:20.341 }' 00:14:20.341 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.341 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.600 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:20.600 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.600 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.600 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:20.600 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.600 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.600 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:20.600 15:50:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.600 15:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.860 15:50:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:20.860 15:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:20.860 [2024-07-12 15:50:41.248583] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:20.860 [2024-07-12 15:50:41.248600] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:20.860 [2024-07-12 15:50:41.248641] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:20.860 [2024-07-12 15:50:41.248678] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:20.860 [2024-07-12 15:50:41.248685] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20ef9d0 name Existed_Raid, state offline 00:14:20.860 15:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2521621 00:14:20.860 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2521621 ']' 00:14:20.860 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2521621 00:14:20.860 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:14:20.860 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:20.860 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2521621 00:14:21.120 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:21.120 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:21.120 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2521621' 00:14:21.120 killing process with pid 2521621 00:14:21.120 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2521621 00:14:21.120 [2024-07-12 15:50:41.319252] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:21.120 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2521621 00:14:21.120 [2024-07-12 15:50:41.333921] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:21.120 15:50:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:14:21.120 00:14:21.120 real 0m24.604s 00:14:21.120 user 0m46.137s 00:14:21.120 sys 0m3.586s 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.121 ************************************ 00:14:21.121 END TEST raid_state_function_test 00:14:21.121 ************************************ 00:14:21.121 15:50:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:21.121 15:50:41 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:14:21.121 15:50:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:21.121 15:50:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:21.121 15:50:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:21.121 ************************************ 00:14:21.121 START TEST 
raid_state_function_test_sb 00:14:21.121 ************************************ 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2526342 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2526342' 00:14:21.121 Process raid pid: 2526342 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2526342 /var/tmp/spdk-raid.sock 00:14:21.121 15:50:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2526342 ']' 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:21.121 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:21.121 15:50:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:21.381 [2024-07-12 15:50:41.594284] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:14:21.381 [2024-07-12 15:50:41.594338] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:21.381 [2024-07-12 15:50:41.692523] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:21.381 [2024-07-12 15:50:41.767034] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:21.381 [2024-07-12 15:50:41.814068] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:21.381 [2024-07-12 15:50:41.814092] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:22.320 [2024-07-12 15:50:42.609566] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:22.320 [2024-07-12 15:50:42.609596] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:22.320 [2024-07-12 15:50:42.609602] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:22.320 [2024-07-12 15:50:42.609608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:22.320 [2024-07-12 15:50:42.609613] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:22.320 [2024-07-12 15:50:42.609618] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # 
local expected_state=configuring 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.320 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:22.580 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.580 "name": "Existed_Raid", 00:14:22.580 "uuid": "61863375-7d90-43ad-975f-9d929c884f7b", 00:14:22.580 "strip_size_kb": 64, 00:14:22.580 "state": "configuring", 00:14:22.580 "raid_level": "raid0", 00:14:22.580 "superblock": true, 00:14:22.580 "num_base_bdevs": 3, 00:14:22.580 "num_base_bdevs_discovered": 0, 00:14:22.580 "num_base_bdevs_operational": 3, 00:14:22.580 "base_bdevs_list": [ 00:14:22.580 { 00:14:22.580 "name": "BaseBdev1", 00:14:22.580 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.580 "is_configured": false, 00:14:22.580 "data_offset": 0, 00:14:22.580 "data_size": 0 00:14:22.580 }, 00:14:22.580 { 00:14:22.580 "name": "BaseBdev2", 00:14:22.580 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.580 "is_configured": false, 00:14:22.580 "data_offset": 0, 00:14:22.580 "data_size": 0 00:14:22.580 }, 00:14:22.580 { 00:14:22.580 "name": "BaseBdev3", 00:14:22.580 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.580 "is_configured": false, 00:14:22.580 "data_offset": 0, 00:14:22.580 "data_size": 0 00:14:22.580 } 00:14:22.580 ] 00:14:22.580 }' 00:14:22.580 15:50:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.580 15:50:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:23.148 15:50:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:23.148 [2024-07-12 15:50:43.547819] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:23.148 [2024-07-12 15:50:43.547836] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c0900 name Existed_Raid, state configuring 00:14:23.148 15:50:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:23.409 [2024-07-12 15:50:43.736329] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:23.409 [2024-07-12 15:50:43.736348] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev1 doesn't exist now 00:14:23.409 [2024-07-12 15:50:43.736354] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:23.409 [2024-07-12 15:50:43.736360] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:23.409 [2024-07-12 15:50:43.736364] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:23.409 [2024-07-12 15:50:43.736369] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:23.409 15:50:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:23.668 [2024-07-12 15:50:43.931498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:23.668 BaseBdev1 00:14:23.668 15:50:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:23.668 15:50:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:23.668 15:50:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:23.668 15:50:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:23.668 15:50:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:23.668 15:50:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:23.668 15:50:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:23.928 [ 00:14:23.928 { 00:14:23.928 "name": "BaseBdev1", 00:14:23.928 "aliases": [ 00:14:23.928 "d391d338-1df6-4b81-8d32-17f96bcf419d" 00:14:23.928 ], 00:14:23.928 "product_name": "Malloc disk", 00:14:23.928 "block_size": 512, 00:14:23.928 "num_blocks": 65536, 00:14:23.928 "uuid": "d391d338-1df6-4b81-8d32-17f96bcf419d", 00:14:23.928 "assigned_rate_limits": { 00:14:23.928 "rw_ios_per_sec": 0, 00:14:23.928 "rw_mbytes_per_sec": 0, 00:14:23.928 "r_mbytes_per_sec": 0, 00:14:23.928 "w_mbytes_per_sec": 0 00:14:23.928 }, 00:14:23.928 "claimed": true, 00:14:23.928 "claim_type": "exclusive_write", 00:14:23.928 "zoned": false, 00:14:23.928 "supported_io_types": { 00:14:23.928 "read": true, 00:14:23.928 "write": true, 00:14:23.928 "unmap": true, 00:14:23.928 "flush": true, 00:14:23.928 "reset": true, 00:14:23.928 "nvme_admin": false, 00:14:23.928 "nvme_io": false, 00:14:23.928 "nvme_io_md": false, 00:14:23.928 "write_zeroes": true, 00:14:23.928 "zcopy": true, 00:14:23.928 "get_zone_info": false, 00:14:23.928 "zone_management": false, 00:14:23.928 "zone_append": false, 00:14:23.928 "compare": false, 00:14:23.928 "compare_and_write": false, 00:14:23.928 "abort": true, 00:14:23.928 "seek_hole": false, 00:14:23.928 "seek_data": false, 00:14:23.928 "copy": true, 00:14:23.928 "nvme_iov_md": false 00:14:23.928 }, 00:14:23.928 "memory_domains": [ 00:14:23.928 { 00:14:23.928 "dma_device_id": "system", 00:14:23.928 "dma_device_type": 1 00:14:23.928 }, 00:14:23.928 { 00:14:23.928 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:14:23.928 "dma_device_type": 2 00:14:23.928 } 00:14:23.928 ], 00:14:23.928 "driver_specific": {} 00:14:23.928 } 00:14:23.928 ] 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.928 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:24.188 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.188 "name": "Existed_Raid", 00:14:24.188 "uuid": "bcd53401-0867-40e5-9900-cd1f3aeb962a", 00:14:24.188 "strip_size_kb": 64, 00:14:24.188 "state": "configuring", 00:14:24.188 "raid_level": "raid0", 00:14:24.188 "superblock": true, 00:14:24.188 "num_base_bdevs": 3, 00:14:24.188 "num_base_bdevs_discovered": 1, 00:14:24.188 "num_base_bdevs_operational": 3, 00:14:24.188 "base_bdevs_list": [ 00:14:24.188 { 00:14:24.188 "name": "BaseBdev1", 00:14:24.188 "uuid": "d391d338-1df6-4b81-8d32-17f96bcf419d", 00:14:24.188 "is_configured": true, 00:14:24.188 "data_offset": 2048, 00:14:24.188 "data_size": 63488 00:14:24.188 }, 00:14:24.188 { 00:14:24.188 "name": "BaseBdev2", 00:14:24.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.188 "is_configured": false, 00:14:24.188 "data_offset": 0, 00:14:24.188 "data_size": 0 00:14:24.188 }, 00:14:24.188 { 00:14:24.188 "name": "BaseBdev3", 00:14:24.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.188 "is_configured": false, 00:14:24.188 "data_offset": 0, 00:14:24.188 "data_size": 0 00:14:24.188 } 00:14:24.188 ] 00:14:24.188 }' 00:14:24.188 15:50:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.188 15:50:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:24.758 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:25.019 [2024-07-12 15:50:45.226766] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:25.019 
[2024-07-12 15:50:45.226797] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c0190 name Existed_Raid, state configuring 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:25.019 [2024-07-12 15:50:45.423297] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:25.019 [2024-07-12 15:50:45.424407] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:25.019 [2024-07-12 15:50:45.424430] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:25.019 [2024-07-12 15:50:45.424436] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:25.019 [2024-07-12 15:50:45.424442] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.019 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:25.279 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.279 "name": "Existed_Raid", 00:14:25.279 "uuid": "0c1bb11b-62cc-4fe1-bb6e-396900019e76", 00:14:25.279 "strip_size_kb": 64, 00:14:25.279 "state": "configuring", 00:14:25.279 "raid_level": "raid0", 00:14:25.279 "superblock": true, 00:14:25.279 "num_base_bdevs": 3, 00:14:25.279 "num_base_bdevs_discovered": 1, 00:14:25.279 "num_base_bdevs_operational": 3, 00:14:25.279 "base_bdevs_list": [ 00:14:25.279 { 00:14:25.279 "name": "BaseBdev1", 00:14:25.279 "uuid": "d391d338-1df6-4b81-8d32-17f96bcf419d", 00:14:25.279 "is_configured": true, 00:14:25.279 "data_offset": 2048, 00:14:25.279 "data_size": 63488 00:14:25.279 }, 00:14:25.279 { 
00:14:25.279 "name": "BaseBdev2", 00:14:25.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.279 "is_configured": false, 00:14:25.279 "data_offset": 0, 00:14:25.279 "data_size": 0 00:14:25.279 }, 00:14:25.279 { 00:14:25.279 "name": "BaseBdev3", 00:14:25.279 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:25.279 "is_configured": false, 00:14:25.279 "data_offset": 0, 00:14:25.279 "data_size": 0 00:14:25.279 } 00:14:25.279 ] 00:14:25.279 }' 00:14:25.279 15:50:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.279 15:50:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:25.850 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:26.110 [2024-07-12 15:50:46.342361] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:26.110 BaseBdev2 00:14:26.110 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:26.110 15:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:26.110 15:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:26.110 15:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:26.110 15:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:26.110 15:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:26.110 15:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:26.110 15:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:26.370 [ 00:14:26.370 { 00:14:26.370 "name": "BaseBdev2", 00:14:26.370 "aliases": [ 00:14:26.370 "ad18ff92-8141-4a3e-8f59-6bd77240f01d" 00:14:26.370 ], 00:14:26.370 "product_name": "Malloc disk", 00:14:26.370 "block_size": 512, 00:14:26.370 "num_blocks": 65536, 00:14:26.370 "uuid": "ad18ff92-8141-4a3e-8f59-6bd77240f01d", 00:14:26.370 "assigned_rate_limits": { 00:14:26.370 "rw_ios_per_sec": 0, 00:14:26.370 "rw_mbytes_per_sec": 0, 00:14:26.370 "r_mbytes_per_sec": 0, 00:14:26.370 "w_mbytes_per_sec": 0 00:14:26.370 }, 00:14:26.370 "claimed": true, 00:14:26.370 "claim_type": "exclusive_write", 00:14:26.370 "zoned": false, 00:14:26.370 "supported_io_types": { 00:14:26.370 "read": true, 00:14:26.370 "write": true, 00:14:26.370 "unmap": true, 00:14:26.370 "flush": true, 00:14:26.370 "reset": true, 00:14:26.370 "nvme_admin": false, 00:14:26.370 "nvme_io": false, 00:14:26.370 "nvme_io_md": false, 00:14:26.370 "write_zeroes": true, 00:14:26.370 "zcopy": true, 00:14:26.370 "get_zone_info": false, 00:14:26.370 "zone_management": false, 00:14:26.370 "zone_append": false, 00:14:26.370 "compare": false, 00:14:26.370 "compare_and_write": false, 00:14:26.370 "abort": true, 00:14:26.370 "seek_hole": false, 00:14:26.370 "seek_data": false, 00:14:26.370 "copy": true, 00:14:26.370 "nvme_iov_md": false 00:14:26.370 }, 00:14:26.370 "memory_domains": [ 00:14:26.370 { 00:14:26.370 
"dma_device_id": "system", 00:14:26.370 "dma_device_type": 1 00:14:26.370 }, 00:14:26.370 { 00:14:26.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:26.370 "dma_device_type": 2 00:14:26.370 } 00:14:26.370 ], 00:14:26.370 "driver_specific": {} 00:14:26.370 } 00:14:26.370 ] 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:26.370 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:26.669 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:26.669 "name": "Existed_Raid", 00:14:26.669 "uuid": "0c1bb11b-62cc-4fe1-bb6e-396900019e76", 00:14:26.669 "strip_size_kb": 64, 00:14:26.669 "state": "configuring", 00:14:26.669 "raid_level": "raid0", 00:14:26.669 "superblock": true, 00:14:26.669 "num_base_bdevs": 3, 00:14:26.669 "num_base_bdevs_discovered": 2, 00:14:26.669 "num_base_bdevs_operational": 3, 00:14:26.669 "base_bdevs_list": [ 00:14:26.669 { 00:14:26.669 "name": "BaseBdev1", 00:14:26.669 "uuid": "d391d338-1df6-4b81-8d32-17f96bcf419d", 00:14:26.669 "is_configured": true, 00:14:26.669 "data_offset": 2048, 00:14:26.669 "data_size": 63488 00:14:26.669 }, 00:14:26.669 { 00:14:26.669 "name": "BaseBdev2", 00:14:26.669 "uuid": "ad18ff92-8141-4a3e-8f59-6bd77240f01d", 00:14:26.669 "is_configured": true, 00:14:26.669 "data_offset": 2048, 00:14:26.669 "data_size": 63488 00:14:26.669 }, 00:14:26.669 { 00:14:26.669 "name": "BaseBdev3", 00:14:26.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:26.669 "is_configured": false, 00:14:26.669 "data_offset": 0, 00:14:26.669 "data_size": 0 00:14:26.669 } 00:14:26.669 ] 00:14:26.669 }' 00:14:26.669 15:50:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:26.670 15:50:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:14:27.240 15:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:27.500 [2024-07-12 15:50:47.718779] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:27.500 [2024-07-12 15:50:47.718899] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14c1280 00:14:27.500 [2024-07-12 15:50:47.718907] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:27.500 [2024-07-12 15:50:47.719046] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c0d70 00:14:27.500 [2024-07-12 15:50:47.719138] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14c1280 00:14:27.500 [2024-07-12 15:50:47.719143] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x14c1280 00:14:27.500 [2024-07-12 15:50:47.719212] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:27.500 BaseBdev3 00:14:27.500 15:50:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:27.500 15:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:27.500 15:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:27.500 15:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:27.500 15:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:27.500 15:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:27.500 15:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:27.500 15:50:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:27.760 [ 00:14:27.760 { 00:14:27.760 "name": "BaseBdev3", 00:14:27.760 "aliases": [ 00:14:27.760 "e775f5b1-cf0d-4f05-a004-41b1b7a47a35" 00:14:27.760 ], 00:14:27.760 "product_name": "Malloc disk", 00:14:27.760 "block_size": 512, 00:14:27.760 "num_blocks": 65536, 00:14:27.760 "uuid": "e775f5b1-cf0d-4f05-a004-41b1b7a47a35", 00:14:27.760 "assigned_rate_limits": { 00:14:27.760 "rw_ios_per_sec": 0, 00:14:27.760 "rw_mbytes_per_sec": 0, 00:14:27.760 "r_mbytes_per_sec": 0, 00:14:27.760 "w_mbytes_per_sec": 0 00:14:27.760 }, 00:14:27.760 "claimed": true, 00:14:27.760 "claim_type": "exclusive_write", 00:14:27.760 "zoned": false, 00:14:27.760 "supported_io_types": { 00:14:27.760 "read": true, 00:14:27.760 "write": true, 00:14:27.760 "unmap": true, 00:14:27.760 "flush": true, 00:14:27.760 "reset": true, 00:14:27.760 "nvme_admin": false, 00:14:27.760 "nvme_io": false, 00:14:27.760 "nvme_io_md": false, 00:14:27.760 "write_zeroes": true, 00:14:27.760 "zcopy": true, 00:14:27.760 "get_zone_info": false, 00:14:27.760 "zone_management": false, 00:14:27.760 "zone_append": false, 00:14:27.760 "compare": false, 00:14:27.760 "compare_and_write": false, 00:14:27.760 "abort": true, 00:14:27.760 "seek_hole": false, 00:14:27.760 "seek_data": false, 00:14:27.760 "copy": true, 00:14:27.760 "nvme_iov_md": false 
00:14:27.760 }, 00:14:27.760 "memory_domains": [ 00:14:27.760 { 00:14:27.760 "dma_device_id": "system", 00:14:27.760 "dma_device_type": 1 00:14:27.760 }, 00:14:27.760 { 00:14:27.760 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:27.760 "dma_device_type": 2 00:14:27.760 } 00:14:27.760 ], 00:14:27.760 "driver_specific": {} 00:14:27.760 } 00:14:27.760 ] 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.760 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:28.020 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.020 "name": "Existed_Raid", 00:14:28.020 "uuid": "0c1bb11b-62cc-4fe1-bb6e-396900019e76", 00:14:28.020 "strip_size_kb": 64, 00:14:28.020 "state": "online", 00:14:28.020 "raid_level": "raid0", 00:14:28.020 "superblock": true, 00:14:28.020 "num_base_bdevs": 3, 00:14:28.020 "num_base_bdevs_discovered": 3, 00:14:28.020 "num_base_bdevs_operational": 3, 00:14:28.020 "base_bdevs_list": [ 00:14:28.020 { 00:14:28.020 "name": "BaseBdev1", 00:14:28.020 "uuid": "d391d338-1df6-4b81-8d32-17f96bcf419d", 00:14:28.020 "is_configured": true, 00:14:28.020 "data_offset": 2048, 00:14:28.020 "data_size": 63488 00:14:28.020 }, 00:14:28.020 { 00:14:28.020 "name": "BaseBdev2", 00:14:28.020 "uuid": "ad18ff92-8141-4a3e-8f59-6bd77240f01d", 00:14:28.020 "is_configured": true, 00:14:28.020 "data_offset": 2048, 00:14:28.020 "data_size": 63488 00:14:28.020 }, 00:14:28.020 { 00:14:28.020 "name": "BaseBdev3", 00:14:28.020 "uuid": "e775f5b1-cf0d-4f05-a004-41b1b7a47a35", 00:14:28.020 "is_configured": true, 00:14:28.020 "data_offset": 2048, 00:14:28.020 "data_size": 63488 00:14:28.020 } 00:14:28.020 ] 00:14:28.020 }' 00:14:28.020 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.020 15:50:48 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:28.590 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:28.590 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:28.590 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:28.590 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:28.590 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:28.590 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:28.590 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:28.590 15:50:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:28.590 [2024-07-12 15:50:49.030351] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:28.852 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:28.852 "name": "Existed_Raid", 00:14:28.852 "aliases": [ 00:14:28.852 "0c1bb11b-62cc-4fe1-bb6e-396900019e76" 00:14:28.852 ], 00:14:28.852 "product_name": "Raid Volume", 00:14:28.852 "block_size": 512, 00:14:28.852 "num_blocks": 190464, 00:14:28.852 "uuid": "0c1bb11b-62cc-4fe1-bb6e-396900019e76", 00:14:28.852 "assigned_rate_limits": { 00:14:28.852 "rw_ios_per_sec": 0, 00:14:28.852 "rw_mbytes_per_sec": 0, 00:14:28.852 "r_mbytes_per_sec": 0, 00:14:28.852 "w_mbytes_per_sec": 0 00:14:28.852 }, 00:14:28.852 "claimed": false, 00:14:28.852 "zoned": false, 00:14:28.852 "supported_io_types": { 00:14:28.852 "read": true, 00:14:28.852 "write": true, 00:14:28.852 "unmap": true, 00:14:28.852 "flush": true, 00:14:28.852 "reset": true, 00:14:28.852 "nvme_admin": false, 00:14:28.852 "nvme_io": false, 00:14:28.852 "nvme_io_md": false, 00:14:28.852 "write_zeroes": true, 00:14:28.852 "zcopy": false, 00:14:28.852 "get_zone_info": false, 00:14:28.852 "zone_management": false, 00:14:28.852 "zone_append": false, 00:14:28.852 "compare": false, 00:14:28.852 "compare_and_write": false, 00:14:28.852 "abort": false, 00:14:28.852 "seek_hole": false, 00:14:28.852 "seek_data": false, 00:14:28.852 "copy": false, 00:14:28.852 "nvme_iov_md": false 00:14:28.852 }, 00:14:28.852 "memory_domains": [ 00:14:28.852 { 00:14:28.852 "dma_device_id": "system", 00:14:28.852 "dma_device_type": 1 00:14:28.852 }, 00:14:28.852 { 00:14:28.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.852 "dma_device_type": 2 00:14:28.852 }, 00:14:28.852 { 00:14:28.852 "dma_device_id": "system", 00:14:28.852 "dma_device_type": 1 00:14:28.852 }, 00:14:28.852 { 00:14:28.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.852 "dma_device_type": 2 00:14:28.852 }, 00:14:28.852 { 00:14:28.852 "dma_device_id": "system", 00:14:28.852 "dma_device_type": 1 00:14:28.852 }, 00:14:28.852 { 00:14:28.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.852 "dma_device_type": 2 00:14:28.852 } 00:14:28.852 ], 00:14:28.852 "driver_specific": { 00:14:28.852 "raid": { 00:14:28.852 "uuid": "0c1bb11b-62cc-4fe1-bb6e-396900019e76", 00:14:28.852 "strip_size_kb": 64, 00:14:28.852 "state": "online", 00:14:28.852 "raid_level": "raid0", 00:14:28.852 "superblock": true, 00:14:28.852 
"num_base_bdevs": 3, 00:14:28.852 "num_base_bdevs_discovered": 3, 00:14:28.852 "num_base_bdevs_operational": 3, 00:14:28.852 "base_bdevs_list": [ 00:14:28.852 { 00:14:28.852 "name": "BaseBdev1", 00:14:28.852 "uuid": "d391d338-1df6-4b81-8d32-17f96bcf419d", 00:14:28.852 "is_configured": true, 00:14:28.852 "data_offset": 2048, 00:14:28.852 "data_size": 63488 00:14:28.852 }, 00:14:28.852 { 00:14:28.852 "name": "BaseBdev2", 00:14:28.852 "uuid": "ad18ff92-8141-4a3e-8f59-6bd77240f01d", 00:14:28.852 "is_configured": true, 00:14:28.852 "data_offset": 2048, 00:14:28.852 "data_size": 63488 00:14:28.852 }, 00:14:28.852 { 00:14:28.852 "name": "BaseBdev3", 00:14:28.852 "uuid": "e775f5b1-cf0d-4f05-a004-41b1b7a47a35", 00:14:28.852 "is_configured": true, 00:14:28.852 "data_offset": 2048, 00:14:28.853 "data_size": 63488 00:14:28.853 } 00:14:28.853 ] 00:14:28.853 } 00:14:28.853 } 00:14:28.853 }' 00:14:28.853 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:28.853 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:28.853 BaseBdev2 00:14:28.853 BaseBdev3' 00:14:28.853 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:28.853 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:28.853 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:28.853 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:28.853 "name": "BaseBdev1", 00:14:28.853 "aliases": [ 00:14:28.853 "d391d338-1df6-4b81-8d32-17f96bcf419d" 00:14:28.853 ], 00:14:28.853 "product_name": "Malloc disk", 00:14:28.853 "block_size": 512, 00:14:28.853 "num_blocks": 65536, 00:14:28.853 "uuid": "d391d338-1df6-4b81-8d32-17f96bcf419d", 00:14:28.853 "assigned_rate_limits": { 00:14:28.853 "rw_ios_per_sec": 0, 00:14:28.853 "rw_mbytes_per_sec": 0, 00:14:28.853 "r_mbytes_per_sec": 0, 00:14:28.853 "w_mbytes_per_sec": 0 00:14:28.853 }, 00:14:28.853 "claimed": true, 00:14:28.853 "claim_type": "exclusive_write", 00:14:28.853 "zoned": false, 00:14:28.853 "supported_io_types": { 00:14:28.853 "read": true, 00:14:28.853 "write": true, 00:14:28.853 "unmap": true, 00:14:28.853 "flush": true, 00:14:28.853 "reset": true, 00:14:28.853 "nvme_admin": false, 00:14:28.853 "nvme_io": false, 00:14:28.853 "nvme_io_md": false, 00:14:28.853 "write_zeroes": true, 00:14:28.853 "zcopy": true, 00:14:28.853 "get_zone_info": false, 00:14:28.853 "zone_management": false, 00:14:28.853 "zone_append": false, 00:14:28.853 "compare": false, 00:14:28.853 "compare_and_write": false, 00:14:28.853 "abort": true, 00:14:28.853 "seek_hole": false, 00:14:28.853 "seek_data": false, 00:14:28.853 "copy": true, 00:14:28.853 "nvme_iov_md": false 00:14:28.853 }, 00:14:28.853 "memory_domains": [ 00:14:28.853 { 00:14:28.853 "dma_device_id": "system", 00:14:28.853 "dma_device_type": 1 00:14:28.853 }, 00:14:28.853 { 00:14:28.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:28.853 "dma_device_type": 2 00:14:28.853 } 00:14:28.853 ], 00:14:28.853 "driver_specific": {} 00:14:28.853 }' 00:14:28.853 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.114 15:50:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.114 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:29.114 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.114 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.114 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:29.114 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.373 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.373 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:29.373 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.373 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:29.373 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:29.373 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:29.373 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:29.373 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:29.634 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:29.634 "name": "BaseBdev2", 00:14:29.634 "aliases": [ 00:14:29.634 "ad18ff92-8141-4a3e-8f59-6bd77240f01d" 00:14:29.634 ], 00:14:29.634 "product_name": "Malloc disk", 00:14:29.634 "block_size": 512, 00:14:29.634 "num_blocks": 65536, 00:14:29.634 "uuid": "ad18ff92-8141-4a3e-8f59-6bd77240f01d", 00:14:29.634 "assigned_rate_limits": { 00:14:29.634 "rw_ios_per_sec": 0, 00:14:29.634 "rw_mbytes_per_sec": 0, 00:14:29.634 "r_mbytes_per_sec": 0, 00:14:29.634 "w_mbytes_per_sec": 0 00:14:29.634 }, 00:14:29.634 "claimed": true, 00:14:29.634 "claim_type": "exclusive_write", 00:14:29.634 "zoned": false, 00:14:29.634 "supported_io_types": { 00:14:29.634 "read": true, 00:14:29.634 "write": true, 00:14:29.634 "unmap": true, 00:14:29.634 "flush": true, 00:14:29.634 "reset": true, 00:14:29.634 "nvme_admin": false, 00:14:29.634 "nvme_io": false, 00:14:29.634 "nvme_io_md": false, 00:14:29.634 "write_zeroes": true, 00:14:29.634 "zcopy": true, 00:14:29.634 "get_zone_info": false, 00:14:29.634 "zone_management": false, 00:14:29.634 "zone_append": false, 00:14:29.634 "compare": false, 00:14:29.634 "compare_and_write": false, 00:14:29.634 "abort": true, 00:14:29.634 "seek_hole": false, 00:14:29.634 "seek_data": false, 00:14:29.634 "copy": true, 00:14:29.634 "nvme_iov_md": false 00:14:29.634 }, 00:14:29.634 "memory_domains": [ 00:14:29.634 { 00:14:29.634 "dma_device_id": "system", 00:14:29.634 "dma_device_type": 1 00:14:29.634 }, 00:14:29.634 { 00:14:29.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:29.634 "dma_device_type": 2 00:14:29.634 } 00:14:29.634 ], 00:14:29.634 "driver_specific": {} 00:14:29.634 }' 00:14:29.634 15:50:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.634 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:29.897 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # 
[[ 512 == 512 ]] 00:14:29.897 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.897 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:29.897 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:29.897 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.897 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:29.897 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:29.897 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:30.156 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:30.156 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:30.156 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:30.156 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:30.156 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:30.156 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:30.156 "name": "BaseBdev3", 00:14:30.156 "aliases": [ 00:14:30.156 "e775f5b1-cf0d-4f05-a004-41b1b7a47a35" 00:14:30.156 ], 00:14:30.156 "product_name": "Malloc disk", 00:14:30.156 "block_size": 512, 00:14:30.156 "num_blocks": 65536, 00:14:30.156 "uuid": "e775f5b1-cf0d-4f05-a004-41b1b7a47a35", 00:14:30.156 "assigned_rate_limits": { 00:14:30.156 "rw_ios_per_sec": 0, 00:14:30.156 "rw_mbytes_per_sec": 0, 00:14:30.156 "r_mbytes_per_sec": 0, 00:14:30.156 "w_mbytes_per_sec": 0 00:14:30.156 }, 00:14:30.156 "claimed": true, 00:14:30.156 "claim_type": "exclusive_write", 00:14:30.156 "zoned": false, 00:14:30.156 "supported_io_types": { 00:14:30.156 "read": true, 00:14:30.156 "write": true, 00:14:30.156 "unmap": true, 00:14:30.156 "flush": true, 00:14:30.156 "reset": true, 00:14:30.156 "nvme_admin": false, 00:14:30.156 "nvme_io": false, 00:14:30.156 "nvme_io_md": false, 00:14:30.156 "write_zeroes": true, 00:14:30.156 "zcopy": true, 00:14:30.156 "get_zone_info": false, 00:14:30.156 "zone_management": false, 00:14:30.156 "zone_append": false, 00:14:30.156 "compare": false, 00:14:30.156 "compare_and_write": false, 00:14:30.156 "abort": true, 00:14:30.156 "seek_hole": false, 00:14:30.156 "seek_data": false, 00:14:30.156 "copy": true, 00:14:30.156 "nvme_iov_md": false 00:14:30.156 }, 00:14:30.156 "memory_domains": [ 00:14:30.156 { 00:14:30.156 "dma_device_id": "system", 00:14:30.156 "dma_device_type": 1 00:14:30.156 }, 00:14:30.156 { 00:14:30.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:30.156 "dma_device_type": 2 00:14:30.156 } 00:14:30.156 ], 00:14:30.156 "driver_specific": {} 00:14:30.156 }' 00:14:30.156 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:30.416 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:30.416 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:30.416 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:30.416 
15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:30.416 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:30.416 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:30.416 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:30.676 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:30.676 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:30.676 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:30.676 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:30.676 15:50:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:31.245 [2024-07-12 15:50:51.488366] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:31.245 [2024-07-12 15:50:51.488388] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:31.245 [2024-07-12 15:50:51.488419] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.245 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:31.504 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:14:31.504 "name": "Existed_Raid", 00:14:31.504 "uuid": "0c1bb11b-62cc-4fe1-bb6e-396900019e76", 00:14:31.504 "strip_size_kb": 64, 00:14:31.504 "state": "offline", 00:14:31.504 "raid_level": "raid0", 00:14:31.504 "superblock": true, 00:14:31.504 "num_base_bdevs": 3, 00:14:31.504 "num_base_bdevs_discovered": 2, 00:14:31.504 "num_base_bdevs_operational": 2, 00:14:31.504 "base_bdevs_list": [ 00:14:31.504 { 00:14:31.504 "name": null, 00:14:31.504 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:31.504 "is_configured": false, 00:14:31.504 "data_offset": 2048, 00:14:31.504 "data_size": 63488 00:14:31.504 }, 00:14:31.504 { 00:14:31.504 "name": "BaseBdev2", 00:14:31.504 "uuid": "ad18ff92-8141-4a3e-8f59-6bd77240f01d", 00:14:31.504 "is_configured": true, 00:14:31.504 "data_offset": 2048, 00:14:31.504 "data_size": 63488 00:14:31.504 }, 00:14:31.504 { 00:14:31.504 "name": "BaseBdev3", 00:14:31.504 "uuid": "e775f5b1-cf0d-4f05-a004-41b1b7a47a35", 00:14:31.504 "is_configured": true, 00:14:31.504 "data_offset": 2048, 00:14:31.504 "data_size": 63488 00:14:31.504 } 00:14:31.504 ] 00:14:31.504 }' 00:14:31.504 15:50:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.504 15:50:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:32.072 15:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:32.072 15:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:32.072 15:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.072 15:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:32.072 15:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:32.072 15:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:32.072 15:50:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:32.640 [2024-07-12 15:50:53.008220] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:32.640 15:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:32.640 15:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:32.640 15:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:32.640 15:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:33.207 15:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:33.207 15:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:33.207 15:50:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:33.777 [2024-07-12 15:50:54.096766] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:33.777 [2024-07-12 15:50:54.096799] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c1280 name Existed_Raid, state offline 00:14:33.777 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:33.777 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:33.777 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:33.777 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:34.036 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:34.036 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:34.036 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:34.036 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:34.036 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:34.036 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:34.296 BaseBdev2 00:14:34.296 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:34.296 15:50:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:34.296 15:50:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:34.296 15:50:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:34.296 15:50:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:34.296 15:50:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:34.296 15:50:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:34.296 15:50:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:34.556 [ 00:14:34.556 { 00:14:34.556 "name": "BaseBdev2", 00:14:34.556 "aliases": [ 00:14:34.556 "0e3ed68d-4526-4c4d-99e7-9e443250ab36" 00:14:34.556 ], 00:14:34.556 "product_name": "Malloc disk", 00:14:34.556 "block_size": 512, 00:14:34.556 "num_blocks": 65536, 00:14:34.556 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:34.556 "assigned_rate_limits": { 00:14:34.556 "rw_ios_per_sec": 0, 00:14:34.556 "rw_mbytes_per_sec": 0, 00:14:34.556 "r_mbytes_per_sec": 0, 00:14:34.556 "w_mbytes_per_sec": 0 00:14:34.556 }, 00:14:34.556 "claimed": false, 00:14:34.556 "zoned": false, 00:14:34.556 "supported_io_types": { 00:14:34.556 "read": true, 00:14:34.556 "write": true, 00:14:34.556 "unmap": true, 00:14:34.556 "flush": true, 00:14:34.556 "reset": true, 00:14:34.556 "nvme_admin": false, 00:14:34.556 "nvme_io": false, 00:14:34.556 "nvme_io_md": false, 00:14:34.556 "write_zeroes": true, 00:14:34.556 "zcopy": true, 00:14:34.556 "get_zone_info": false, 00:14:34.556 "zone_management": false, 00:14:34.556 
"zone_append": false, 00:14:34.556 "compare": false, 00:14:34.556 "compare_and_write": false, 00:14:34.556 "abort": true, 00:14:34.556 "seek_hole": false, 00:14:34.556 "seek_data": false, 00:14:34.556 "copy": true, 00:14:34.556 "nvme_iov_md": false 00:14:34.556 }, 00:14:34.556 "memory_domains": [ 00:14:34.556 { 00:14:34.556 "dma_device_id": "system", 00:14:34.556 "dma_device_type": 1 00:14:34.556 }, 00:14:34.556 { 00:14:34.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:34.556 "dma_device_type": 2 00:14:34.556 } 00:14:34.556 ], 00:14:34.556 "driver_specific": {} 00:14:34.556 } 00:14:34.556 ] 00:14:34.556 15:50:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:34.556 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:34.556 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:34.556 15:50:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:34.816 BaseBdev3 00:14:34.816 15:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:34.816 15:50:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:34.816 15:50:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:34.816 15:50:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:34.816 15:50:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:34.816 15:50:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:34.816 15:50:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:35.076 15:50:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:35.645 [ 00:14:35.645 { 00:14:35.645 "name": "BaseBdev3", 00:14:35.645 "aliases": [ 00:14:35.645 "73a060a2-bed3-4240-9449-05aa9131f7c4" 00:14:35.645 ], 00:14:35.645 "product_name": "Malloc disk", 00:14:35.645 "block_size": 512, 00:14:35.645 "num_blocks": 65536, 00:14:35.645 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:35.645 "assigned_rate_limits": { 00:14:35.645 "rw_ios_per_sec": 0, 00:14:35.645 "rw_mbytes_per_sec": 0, 00:14:35.645 "r_mbytes_per_sec": 0, 00:14:35.645 "w_mbytes_per_sec": 0 00:14:35.645 }, 00:14:35.645 "claimed": false, 00:14:35.645 "zoned": false, 00:14:35.645 "supported_io_types": { 00:14:35.645 "read": true, 00:14:35.645 "write": true, 00:14:35.645 "unmap": true, 00:14:35.645 "flush": true, 00:14:35.645 "reset": true, 00:14:35.645 "nvme_admin": false, 00:14:35.645 "nvme_io": false, 00:14:35.645 "nvme_io_md": false, 00:14:35.645 "write_zeroes": true, 00:14:35.645 "zcopy": true, 00:14:35.645 "get_zone_info": false, 00:14:35.645 "zone_management": false, 00:14:35.645 "zone_append": false, 00:14:35.645 "compare": false, 00:14:35.645 "compare_and_write": false, 00:14:35.645 "abort": true, 00:14:35.645 "seek_hole": false, 00:14:35.645 "seek_data": false, 00:14:35.645 "copy": true, 00:14:35.645 "nvme_iov_md": false 
00:14:35.645 }, 00:14:35.645 "memory_domains": [ 00:14:35.645 { 00:14:35.645 "dma_device_id": "system", 00:14:35.645 "dma_device_type": 1 00:14:35.645 }, 00:14:35.645 { 00:14:35.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:35.645 "dma_device_type": 2 00:14:35.645 } 00:14:35.645 ], 00:14:35.645 "driver_specific": {} 00:14:35.645 } 00:14:35.645 ] 00:14:35.645 15:50:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:35.645 15:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:35.645 15:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:35.645 15:50:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:35.905 [2024-07-12 15:50:56.105747] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:35.905 [2024-07-12 15:50:56.105776] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:35.905 [2024-07-12 15:50:56.105788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:35.905 [2024-07-12 15:50:56.106816] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:35.905 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:36.475 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:36.475 "name": "Existed_Raid", 00:14:36.475 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:36.475 "strip_size_kb": 64, 00:14:36.475 "state": "configuring", 00:14:36.475 "raid_level": "raid0", 00:14:36.475 "superblock": true, 00:14:36.475 "num_base_bdevs": 3, 00:14:36.475 "num_base_bdevs_discovered": 2, 00:14:36.475 "num_base_bdevs_operational": 3, 00:14:36.475 "base_bdevs_list": [ 00:14:36.475 { 00:14:36.475 "name": "BaseBdev1", 00:14:36.475 
"uuid": "00000000-0000-0000-0000-000000000000", 00:14:36.475 "is_configured": false, 00:14:36.475 "data_offset": 0, 00:14:36.475 "data_size": 0 00:14:36.475 }, 00:14:36.475 { 00:14:36.475 "name": "BaseBdev2", 00:14:36.475 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:36.475 "is_configured": true, 00:14:36.475 "data_offset": 2048, 00:14:36.475 "data_size": 63488 00:14:36.475 }, 00:14:36.475 { 00:14:36.475 "name": "BaseBdev3", 00:14:36.475 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:36.475 "is_configured": true, 00:14:36.475 "data_offset": 2048, 00:14:36.475 "data_size": 63488 00:14:36.475 } 00:14:36.475 ] 00:14:36.475 }' 00:14:36.475 15:50:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:36.475 15:50:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:37.044 [2024-07-12 15:50:57.364884] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.044 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.305 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.305 "name": "Existed_Raid", 00:14:37.305 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:37.305 "strip_size_kb": 64, 00:14:37.305 "state": "configuring", 00:14:37.305 "raid_level": "raid0", 00:14:37.305 "superblock": true, 00:14:37.305 "num_base_bdevs": 3, 00:14:37.305 "num_base_bdevs_discovered": 1, 00:14:37.305 "num_base_bdevs_operational": 3, 00:14:37.305 "base_bdevs_list": [ 00:14:37.305 { 00:14:37.305 "name": "BaseBdev1", 00:14:37.305 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.305 "is_configured": false, 00:14:37.305 "data_offset": 0, 00:14:37.305 "data_size": 0 00:14:37.305 }, 00:14:37.305 { 00:14:37.305 "name": null, 00:14:37.305 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:37.305 
"is_configured": false, 00:14:37.305 "data_offset": 2048, 00:14:37.305 "data_size": 63488 00:14:37.305 }, 00:14:37.305 { 00:14:37.305 "name": "BaseBdev3", 00:14:37.305 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:37.305 "is_configured": true, 00:14:37.305 "data_offset": 2048, 00:14:37.305 "data_size": 63488 00:14:37.305 } 00:14:37.305 ] 00:14:37.305 }' 00:14:37.305 15:50:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.305 15:50:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:37.874 15:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.874 15:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:38.134 15:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:38.134 15:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:38.134 [2024-07-12 15:50:58.512850] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:38.134 BaseBdev1 00:14:38.134 15:50:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:38.134 15:50:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:38.134 15:50:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:38.134 15:50:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:38.134 15:50:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:38.134 15:50:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:38.134 15:50:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:38.703 15:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:39.272 [ 00:14:39.272 { 00:14:39.272 "name": "BaseBdev1", 00:14:39.272 "aliases": [ 00:14:39.272 "c112e58b-5728-42c5-aea4-16375a6b793f" 00:14:39.272 ], 00:14:39.272 "product_name": "Malloc disk", 00:14:39.272 "block_size": 512, 00:14:39.272 "num_blocks": 65536, 00:14:39.272 "uuid": "c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:39.272 "assigned_rate_limits": { 00:14:39.272 "rw_ios_per_sec": 0, 00:14:39.272 "rw_mbytes_per_sec": 0, 00:14:39.272 "r_mbytes_per_sec": 0, 00:14:39.272 "w_mbytes_per_sec": 0 00:14:39.272 }, 00:14:39.272 "claimed": true, 00:14:39.272 "claim_type": "exclusive_write", 00:14:39.272 "zoned": false, 00:14:39.272 "supported_io_types": { 00:14:39.272 "read": true, 00:14:39.272 "write": true, 00:14:39.272 "unmap": true, 00:14:39.272 "flush": true, 00:14:39.272 "reset": true, 00:14:39.272 "nvme_admin": false, 00:14:39.272 "nvme_io": false, 00:14:39.272 "nvme_io_md": false, 00:14:39.272 "write_zeroes": true, 00:14:39.272 "zcopy": true, 00:14:39.272 "get_zone_info": false, 00:14:39.272 "zone_management": 
false, 00:14:39.272 "zone_append": false, 00:14:39.272 "compare": false, 00:14:39.272 "compare_and_write": false, 00:14:39.272 "abort": true, 00:14:39.272 "seek_hole": false, 00:14:39.272 "seek_data": false, 00:14:39.272 "copy": true, 00:14:39.272 "nvme_iov_md": false 00:14:39.272 }, 00:14:39.272 "memory_domains": [ 00:14:39.272 { 00:14:39.272 "dma_device_id": "system", 00:14:39.272 "dma_device_type": 1 00:14:39.272 }, 00:14:39.272 { 00:14:39.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:39.272 "dma_device_type": 2 00:14:39.272 } 00:14:39.272 ], 00:14:39.272 "driver_specific": {} 00:14:39.272 } 00:14:39.272 ] 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.272 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.532 15:50:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.532 "name": "Existed_Raid", 00:14:39.532 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:39.532 "strip_size_kb": 64, 00:14:39.532 "state": "configuring", 00:14:39.532 "raid_level": "raid0", 00:14:39.532 "superblock": true, 00:14:39.532 "num_base_bdevs": 3, 00:14:39.532 "num_base_bdevs_discovered": 2, 00:14:39.532 "num_base_bdevs_operational": 3, 00:14:39.532 "base_bdevs_list": [ 00:14:39.532 { 00:14:39.532 "name": "BaseBdev1", 00:14:39.532 "uuid": "c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:39.532 "is_configured": true, 00:14:39.532 "data_offset": 2048, 00:14:39.532 "data_size": 63488 00:14:39.532 }, 00:14:39.532 { 00:14:39.532 "name": null, 00:14:39.532 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:39.532 "is_configured": false, 00:14:39.532 "data_offset": 2048, 00:14:39.532 "data_size": 63488 00:14:39.532 }, 00:14:39.532 { 00:14:39.532 "name": "BaseBdev3", 00:14:39.532 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:39.532 "is_configured": true, 00:14:39.532 "data_offset": 2048, 00:14:39.532 "data_size": 63488 00:14:39.532 } 00:14:39.532 ] 00:14:39.532 }' 00:14:39.532 15:50:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.532 15:50:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:40.100 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.100 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:40.360 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:40.361 [2024-07-12 15:51:00.734480] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.361 15:51:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.930 15:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.930 "name": "Existed_Raid", 00:14:40.930 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:40.930 "strip_size_kb": 64, 00:14:40.930 "state": "configuring", 00:14:40.930 "raid_level": "raid0", 00:14:40.930 "superblock": true, 00:14:40.930 "num_base_bdevs": 3, 00:14:40.930 "num_base_bdevs_discovered": 1, 00:14:40.930 "num_base_bdevs_operational": 3, 00:14:40.930 "base_bdevs_list": [ 00:14:40.930 { 00:14:40.930 "name": "BaseBdev1", 00:14:40.930 "uuid": "c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:40.931 "is_configured": true, 00:14:40.931 "data_offset": 2048, 00:14:40.931 "data_size": 63488 00:14:40.931 }, 00:14:40.931 { 00:14:40.931 "name": null, 00:14:40.931 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:40.931 "is_configured": false, 00:14:40.931 "data_offset": 2048, 00:14:40.931 "data_size": 63488 00:14:40.931 }, 00:14:40.931 { 00:14:40.931 "name": null, 00:14:40.931 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:40.931 "is_configured": false, 
00:14:40.931 "data_offset": 2048, 00:14:40.931 "data_size": 63488 00:14:40.931 } 00:14:40.931 ] 00:14:40.931 }' 00:14:40.931 15:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.931 15:51:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:41.501 15:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:41.501 15:51:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.762 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:41.762 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:42.021 [2024-07-12 15:51:02.214255] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.021 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.021 "name": "Existed_Raid", 00:14:42.021 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:42.021 "strip_size_kb": 64, 00:14:42.021 "state": "configuring", 00:14:42.021 "raid_level": "raid0", 00:14:42.021 "superblock": true, 00:14:42.021 "num_base_bdevs": 3, 00:14:42.021 "num_base_bdevs_discovered": 2, 00:14:42.021 "num_base_bdevs_operational": 3, 00:14:42.021 "base_bdevs_list": [ 00:14:42.021 { 00:14:42.021 "name": "BaseBdev1", 00:14:42.021 "uuid": "c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:42.021 "is_configured": true, 00:14:42.021 "data_offset": 2048, 00:14:42.022 "data_size": 63488 00:14:42.022 }, 00:14:42.022 { 00:14:42.022 "name": null, 00:14:42.022 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:42.022 "is_configured": false, 00:14:42.022 "data_offset": 
2048, 00:14:42.022 "data_size": 63488 00:14:42.022 }, 00:14:42.022 { 00:14:42.022 "name": "BaseBdev3", 00:14:42.022 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:42.022 "is_configured": true, 00:14:42.022 "data_offset": 2048, 00:14:42.022 "data_size": 63488 00:14:42.022 } 00:14:42.022 ] 00:14:42.022 }' 00:14:42.022 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.022 15:51:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:42.590 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:42.590 15:51:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:42.848 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:42.848 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:43.107 [2024-07-12 15:51:03.309043] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.107 "name": "Existed_Raid", 00:14:43.107 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:43.107 "strip_size_kb": 64, 00:14:43.107 "state": "configuring", 00:14:43.107 "raid_level": "raid0", 00:14:43.107 "superblock": true, 00:14:43.107 "num_base_bdevs": 3, 00:14:43.107 "num_base_bdevs_discovered": 1, 00:14:43.107 "num_base_bdevs_operational": 3, 00:14:43.107 "base_bdevs_list": [ 00:14:43.107 { 00:14:43.107 "name": null, 00:14:43.107 "uuid": "c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:43.107 "is_configured": false, 00:14:43.107 "data_offset": 2048, 00:14:43.107 "data_size": 63488 00:14:43.107 }, 00:14:43.107 
{ 00:14:43.107 "name": null, 00:14:43.107 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:43.107 "is_configured": false, 00:14:43.107 "data_offset": 2048, 00:14:43.107 "data_size": 63488 00:14:43.107 }, 00:14:43.107 { 00:14:43.107 "name": "BaseBdev3", 00:14:43.107 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:43.107 "is_configured": true, 00:14:43.107 "data_offset": 2048, 00:14:43.107 "data_size": 63488 00:14:43.107 } 00:14:43.107 ] 00:14:43.107 }' 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.107 15:51:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:43.719 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.719 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:43.979 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:43.979 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:44.239 [2024-07-12 15:51:04.453733] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.239 "name": "Existed_Raid", 00:14:44.239 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:44.239 "strip_size_kb": 64, 00:14:44.239 "state": "configuring", 00:14:44.239 "raid_level": "raid0", 00:14:44.239 "superblock": true, 00:14:44.239 "num_base_bdevs": 3, 00:14:44.239 "num_base_bdevs_discovered": 2, 00:14:44.239 "num_base_bdevs_operational": 3, 00:14:44.239 "base_bdevs_list": [ 00:14:44.239 { 00:14:44.239 "name": 
null, 00:14:44.239 "uuid": "c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:44.239 "is_configured": false, 00:14:44.239 "data_offset": 2048, 00:14:44.239 "data_size": 63488 00:14:44.239 }, 00:14:44.239 { 00:14:44.239 "name": "BaseBdev2", 00:14:44.239 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:44.239 "is_configured": true, 00:14:44.239 "data_offset": 2048, 00:14:44.239 "data_size": 63488 00:14:44.239 }, 00:14:44.239 { 00:14:44.239 "name": "BaseBdev3", 00:14:44.239 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:44.239 "is_configured": true, 00:14:44.239 "data_offset": 2048, 00:14:44.239 "data_size": 63488 00:14:44.239 } 00:14:44.239 ] 00:14:44.239 }' 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.239 15:51:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:44.809 15:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.809 15:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:45.068 15:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:45.068 15:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:45.068 15:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:45.328 15:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c112e58b-5728-42c5-aea4-16375a6b793f 00:14:45.328 [2024-07-12 15:51:05.753979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:45.328 [2024-07-12 15:51:05.754093] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14c4410 00:14:45.328 [2024-07-12 15:51:05.754101] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:45.328 [2024-07-12 15:51:05.754241] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14c3020 00:14:45.328 [2024-07-12 15:51:05.754334] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14c4410 00:14:45.328 [2024-07-12 15:51:05.754343] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x14c4410 00:14:45.328 [2024-07-12 15:51:05.754417] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:45.328 NewBaseBdev 00:14:45.328 15:51:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:45.328 15:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:14:45.328 15:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:45.328 15:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:45.328 15:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:45.328 15:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:45.328 15:51:05 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:45.587 15:51:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:45.846 [ 00:14:45.846 { 00:14:45.846 "name": "NewBaseBdev", 00:14:45.846 "aliases": [ 00:14:45.846 "c112e58b-5728-42c5-aea4-16375a6b793f" 00:14:45.846 ], 00:14:45.846 "product_name": "Malloc disk", 00:14:45.846 "block_size": 512, 00:14:45.846 "num_blocks": 65536, 00:14:45.846 "uuid": "c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:45.846 "assigned_rate_limits": { 00:14:45.846 "rw_ios_per_sec": 0, 00:14:45.846 "rw_mbytes_per_sec": 0, 00:14:45.846 "r_mbytes_per_sec": 0, 00:14:45.846 "w_mbytes_per_sec": 0 00:14:45.846 }, 00:14:45.846 "claimed": true, 00:14:45.846 "claim_type": "exclusive_write", 00:14:45.846 "zoned": false, 00:14:45.846 "supported_io_types": { 00:14:45.846 "read": true, 00:14:45.846 "write": true, 00:14:45.846 "unmap": true, 00:14:45.846 "flush": true, 00:14:45.846 "reset": true, 00:14:45.846 "nvme_admin": false, 00:14:45.846 "nvme_io": false, 00:14:45.846 "nvme_io_md": false, 00:14:45.846 "write_zeroes": true, 00:14:45.846 "zcopy": true, 00:14:45.846 "get_zone_info": false, 00:14:45.846 "zone_management": false, 00:14:45.846 "zone_append": false, 00:14:45.846 "compare": false, 00:14:45.846 "compare_and_write": false, 00:14:45.846 "abort": true, 00:14:45.846 "seek_hole": false, 00:14:45.846 "seek_data": false, 00:14:45.846 "copy": true, 00:14:45.846 "nvme_iov_md": false 00:14:45.846 }, 00:14:45.846 "memory_domains": [ 00:14:45.846 { 00:14:45.846 "dma_device_id": "system", 00:14:45.846 "dma_device_type": 1 00:14:45.846 }, 00:14:45.846 { 00:14:45.846 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.846 "dma_device_type": 2 00:14:45.846 } 00:14:45.846 ], 00:14:45.846 "driver_specific": {} 00:14:45.846 } 00:14:45.846 ] 00:14:45.846 15:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:45.846 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:45.846 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:45.846 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:45.846 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:45.846 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:45.846 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:45.847 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:45.847 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:45.847 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:45.847 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:45.847 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:14:45.847 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.105 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.105 "name": "Existed_Raid", 00:14:46.105 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:46.105 "strip_size_kb": 64, 00:14:46.105 "state": "online", 00:14:46.105 "raid_level": "raid0", 00:14:46.105 "superblock": true, 00:14:46.105 "num_base_bdevs": 3, 00:14:46.105 "num_base_bdevs_discovered": 3, 00:14:46.105 "num_base_bdevs_operational": 3, 00:14:46.105 "base_bdevs_list": [ 00:14:46.105 { 00:14:46.105 "name": "NewBaseBdev", 00:14:46.105 "uuid": "c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:46.105 "is_configured": true, 00:14:46.105 "data_offset": 2048, 00:14:46.105 "data_size": 63488 00:14:46.105 }, 00:14:46.105 { 00:14:46.105 "name": "BaseBdev2", 00:14:46.105 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:46.105 "is_configured": true, 00:14:46.105 "data_offset": 2048, 00:14:46.105 "data_size": 63488 00:14:46.105 }, 00:14:46.105 { 00:14:46.105 "name": "BaseBdev3", 00:14:46.105 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:46.105 "is_configured": true, 00:14:46.105 "data_offset": 2048, 00:14:46.105 "data_size": 63488 00:14:46.105 } 00:14:46.105 ] 00:14:46.105 }' 00:14:46.105 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.105 15:51:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:46.673 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:46.673 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:46.673 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:46.673 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:46.673 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:46.673 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:46.673 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:46.673 15:51:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:46.673 [2024-07-12 15:51:07.037455] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:46.673 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:46.673 "name": "Existed_Raid", 00:14:46.673 "aliases": [ 00:14:46.673 "d95dce7f-3b47-41b7-95d3-49f201d2cf5e" 00:14:46.673 ], 00:14:46.673 "product_name": "Raid Volume", 00:14:46.673 "block_size": 512, 00:14:46.673 "num_blocks": 190464, 00:14:46.673 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:46.673 "assigned_rate_limits": { 00:14:46.673 "rw_ios_per_sec": 0, 00:14:46.673 "rw_mbytes_per_sec": 0, 00:14:46.673 "r_mbytes_per_sec": 0, 00:14:46.673 "w_mbytes_per_sec": 0 00:14:46.673 }, 00:14:46.673 "claimed": false, 00:14:46.673 "zoned": false, 00:14:46.673 "supported_io_types": { 00:14:46.673 "read": true, 00:14:46.673 "write": true, 00:14:46.673 "unmap": true, 00:14:46.673 "flush": true, 00:14:46.673 "reset": true, 
00:14:46.673 "nvme_admin": false, 00:14:46.673 "nvme_io": false, 00:14:46.673 "nvme_io_md": false, 00:14:46.673 "write_zeroes": true, 00:14:46.673 "zcopy": false, 00:14:46.673 "get_zone_info": false, 00:14:46.673 "zone_management": false, 00:14:46.673 "zone_append": false, 00:14:46.673 "compare": false, 00:14:46.673 "compare_and_write": false, 00:14:46.673 "abort": false, 00:14:46.673 "seek_hole": false, 00:14:46.673 "seek_data": false, 00:14:46.673 "copy": false, 00:14:46.673 "nvme_iov_md": false 00:14:46.673 }, 00:14:46.673 "memory_domains": [ 00:14:46.673 { 00:14:46.673 "dma_device_id": "system", 00:14:46.673 "dma_device_type": 1 00:14:46.673 }, 00:14:46.673 { 00:14:46.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.673 "dma_device_type": 2 00:14:46.673 }, 00:14:46.673 { 00:14:46.673 "dma_device_id": "system", 00:14:46.673 "dma_device_type": 1 00:14:46.673 }, 00:14:46.673 { 00:14:46.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.673 "dma_device_type": 2 00:14:46.673 }, 00:14:46.673 { 00:14:46.673 "dma_device_id": "system", 00:14:46.673 "dma_device_type": 1 00:14:46.673 }, 00:14:46.673 { 00:14:46.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.673 "dma_device_type": 2 00:14:46.673 } 00:14:46.673 ], 00:14:46.673 "driver_specific": { 00:14:46.673 "raid": { 00:14:46.673 "uuid": "d95dce7f-3b47-41b7-95d3-49f201d2cf5e", 00:14:46.673 "strip_size_kb": 64, 00:14:46.673 "state": "online", 00:14:46.673 "raid_level": "raid0", 00:14:46.673 "superblock": true, 00:14:46.673 "num_base_bdevs": 3, 00:14:46.673 "num_base_bdevs_discovered": 3, 00:14:46.673 "num_base_bdevs_operational": 3, 00:14:46.673 "base_bdevs_list": [ 00:14:46.673 { 00:14:46.673 "name": "NewBaseBdev", 00:14:46.673 "uuid": "c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:46.673 "is_configured": true, 00:14:46.673 "data_offset": 2048, 00:14:46.673 "data_size": 63488 00:14:46.673 }, 00:14:46.673 { 00:14:46.673 "name": "BaseBdev2", 00:14:46.673 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:46.673 "is_configured": true, 00:14:46.673 "data_offset": 2048, 00:14:46.673 "data_size": 63488 00:14:46.673 }, 00:14:46.673 { 00:14:46.673 "name": "BaseBdev3", 00:14:46.673 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:46.673 "is_configured": true, 00:14:46.673 "data_offset": 2048, 00:14:46.673 "data_size": 63488 00:14:46.673 } 00:14:46.673 ] 00:14:46.673 } 00:14:46.673 } 00:14:46.673 }' 00:14:46.673 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:46.673 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:46.673 BaseBdev2 00:14:46.673 BaseBdev3' 00:14:46.673 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:46.673 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:46.673 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:46.933 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:46.933 "name": "NewBaseBdev", 00:14:46.933 "aliases": [ 00:14:46.933 "c112e58b-5728-42c5-aea4-16375a6b793f" 00:14:46.933 ], 00:14:46.933 "product_name": "Malloc disk", 00:14:46.933 "block_size": 512, 00:14:46.933 "num_blocks": 65536, 00:14:46.933 "uuid": 
"c112e58b-5728-42c5-aea4-16375a6b793f", 00:14:46.933 "assigned_rate_limits": { 00:14:46.933 "rw_ios_per_sec": 0, 00:14:46.933 "rw_mbytes_per_sec": 0, 00:14:46.933 "r_mbytes_per_sec": 0, 00:14:46.933 "w_mbytes_per_sec": 0 00:14:46.933 }, 00:14:46.933 "claimed": true, 00:14:46.933 "claim_type": "exclusive_write", 00:14:46.933 "zoned": false, 00:14:46.933 "supported_io_types": { 00:14:46.933 "read": true, 00:14:46.933 "write": true, 00:14:46.933 "unmap": true, 00:14:46.933 "flush": true, 00:14:46.933 "reset": true, 00:14:46.933 "nvme_admin": false, 00:14:46.933 "nvme_io": false, 00:14:46.933 "nvme_io_md": false, 00:14:46.933 "write_zeroes": true, 00:14:46.933 "zcopy": true, 00:14:46.933 "get_zone_info": false, 00:14:46.933 "zone_management": false, 00:14:46.933 "zone_append": false, 00:14:46.933 "compare": false, 00:14:46.933 "compare_and_write": false, 00:14:46.933 "abort": true, 00:14:46.933 "seek_hole": false, 00:14:46.933 "seek_data": false, 00:14:46.933 "copy": true, 00:14:46.933 "nvme_iov_md": false 00:14:46.933 }, 00:14:46.933 "memory_domains": [ 00:14:46.933 { 00:14:46.933 "dma_device_id": "system", 00:14:46.933 "dma_device_type": 1 00:14:46.933 }, 00:14:46.933 { 00:14:46.933 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.933 "dma_device_type": 2 00:14:46.933 } 00:14:46.933 ], 00:14:46.933 "driver_specific": {} 00:14:46.933 }' 00:14:46.933 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:46.933 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.193 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:47.193 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.193 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.193 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:47.193 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:47.193 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:47.193 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:47.193 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:47.193 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:47.452 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:47.452 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:47.452 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:47.452 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:47.452 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:47.452 "name": "BaseBdev2", 00:14:47.452 "aliases": [ 00:14:47.452 "0e3ed68d-4526-4c4d-99e7-9e443250ab36" 00:14:47.452 ], 00:14:47.452 "product_name": "Malloc disk", 00:14:47.452 "block_size": 512, 00:14:47.452 "num_blocks": 65536, 00:14:47.452 "uuid": "0e3ed68d-4526-4c4d-99e7-9e443250ab36", 00:14:47.452 "assigned_rate_limits": { 00:14:47.453 "rw_ios_per_sec": 0, 00:14:47.453 
"rw_mbytes_per_sec": 0, 00:14:47.453 "r_mbytes_per_sec": 0, 00:14:47.453 "w_mbytes_per_sec": 0 00:14:47.453 }, 00:14:47.453 "claimed": true, 00:14:47.453 "claim_type": "exclusive_write", 00:14:47.453 "zoned": false, 00:14:47.453 "supported_io_types": { 00:14:47.453 "read": true, 00:14:47.453 "write": true, 00:14:47.453 "unmap": true, 00:14:47.453 "flush": true, 00:14:47.453 "reset": true, 00:14:47.453 "nvme_admin": false, 00:14:47.453 "nvme_io": false, 00:14:47.453 "nvme_io_md": false, 00:14:47.453 "write_zeroes": true, 00:14:47.453 "zcopy": true, 00:14:47.453 "get_zone_info": false, 00:14:47.453 "zone_management": false, 00:14:47.453 "zone_append": false, 00:14:47.453 "compare": false, 00:14:47.453 "compare_and_write": false, 00:14:47.453 "abort": true, 00:14:47.453 "seek_hole": false, 00:14:47.453 "seek_data": false, 00:14:47.453 "copy": true, 00:14:47.453 "nvme_iov_md": false 00:14:47.453 }, 00:14:47.453 "memory_domains": [ 00:14:47.453 { 00:14:47.453 "dma_device_id": "system", 00:14:47.453 "dma_device_type": 1 00:14:47.453 }, 00:14:47.453 { 00:14:47.453 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.453 "dma_device_type": 2 00:14:47.453 } 00:14:47.453 ], 00:14:47.453 "driver_specific": {} 00:14:47.453 }' 00:14:47.453 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.712 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.712 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:47.712 15:51:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.712 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:47.712 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:47.712 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:47.712 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:47.712 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:47.712 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:47.971 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:47.971 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:47.971 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:47.971 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:47.971 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:48.231 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:48.231 "name": "BaseBdev3", 00:14:48.231 "aliases": [ 00:14:48.231 "73a060a2-bed3-4240-9449-05aa9131f7c4" 00:14:48.231 ], 00:14:48.231 "product_name": "Malloc disk", 00:14:48.231 "block_size": 512, 00:14:48.231 "num_blocks": 65536, 00:14:48.231 "uuid": "73a060a2-bed3-4240-9449-05aa9131f7c4", 00:14:48.231 "assigned_rate_limits": { 00:14:48.231 "rw_ios_per_sec": 0, 00:14:48.231 "rw_mbytes_per_sec": 0, 00:14:48.231 "r_mbytes_per_sec": 0, 00:14:48.231 "w_mbytes_per_sec": 0 00:14:48.231 }, 00:14:48.231 
"claimed": true, 00:14:48.231 "claim_type": "exclusive_write", 00:14:48.231 "zoned": false, 00:14:48.231 "supported_io_types": { 00:14:48.231 "read": true, 00:14:48.231 "write": true, 00:14:48.231 "unmap": true, 00:14:48.231 "flush": true, 00:14:48.231 "reset": true, 00:14:48.231 "nvme_admin": false, 00:14:48.231 "nvme_io": false, 00:14:48.231 "nvme_io_md": false, 00:14:48.231 "write_zeroes": true, 00:14:48.231 "zcopy": true, 00:14:48.231 "get_zone_info": false, 00:14:48.231 "zone_management": false, 00:14:48.231 "zone_append": false, 00:14:48.231 "compare": false, 00:14:48.231 "compare_and_write": false, 00:14:48.231 "abort": true, 00:14:48.231 "seek_hole": false, 00:14:48.231 "seek_data": false, 00:14:48.231 "copy": true, 00:14:48.231 "nvme_iov_md": false 00:14:48.231 }, 00:14:48.231 "memory_domains": [ 00:14:48.231 { 00:14:48.231 "dma_device_id": "system", 00:14:48.231 "dma_device_type": 1 00:14:48.231 }, 00:14:48.231 { 00:14:48.231 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.231 "dma_device_type": 2 00:14:48.231 } 00:14:48.231 ], 00:14:48.231 "driver_specific": {} 00:14:48.231 }' 00:14:48.231 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.231 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.231 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:48.231 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.231 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.231 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:48.231 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.231 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.491 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:48.491 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.491 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.491 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:48.491 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:48.752 [2024-07-12 15:51:08.958096] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:48.752 [2024-07-12 15:51:08.958114] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:48.752 [2024-07-12 15:51:08.958151] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:48.752 [2024-07-12 15:51:08.958190] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:48.752 [2024-07-12 15:51:08.958196] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14c4410 name Existed_Raid, state offline 00:14:48.752 15:51:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2526342 00:14:48.752 15:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2526342 ']' 00:14:48.752 15:51:08 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@952 -- # kill -0 2526342 00:14:48.752 15:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:48.752 15:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:48.752 15:51:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2526342 00:14:48.752 15:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:48.752 15:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:48.752 15:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2526342' 00:14:48.752 killing process with pid 2526342 00:14:48.752 15:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2526342 00:14:48.752 [2024-07-12 15:51:09.025496] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:48.752 15:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2526342 00:14:48.752 [2024-07-12 15:51:09.040228] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:48.752 15:51:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:48.752 00:14:48.752 real 0m27.632s 00:14:48.752 user 0m52.001s 00:14:48.752 sys 0m3.853s 00:14:48.752 15:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:48.752 15:51:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:48.752 ************************************ 00:14:48.752 END TEST raid_state_function_test_sb 00:14:48.752 ************************************ 00:14:49.012 15:51:09 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:49.012 15:51:09 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:49.012 15:51:09 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:49.012 15:51:09 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:49.012 15:51:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:49.012 ************************************ 00:14:49.012 START TEST raid_superblock_test 00:14:49.012 ************************************ 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@398 -- # local strip_size 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2531729 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2531729 /var/tmp/spdk-raid.sock 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2531729 ']' 00:14:49.012 15:51:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:49.013 15:51:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:49.013 15:51:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:49.013 15:51:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:49.013 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:49.013 15:51:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:49.013 15:51:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:49.013 [2024-07-12 15:51:09.301692] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
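For reference, the raid_superblock_test run that follows boils down to the RPC sequence sketched below. This is a minimal reconstruction from the commands visible in this trace only (rpc.py path shortened; socket path, bdev sizes, names and UUIDs are the ones the log itself uses), not a canonical recipe:

  # create three 32 MiB malloc bdevs (512-byte blocks) and wrap each in a passthru bdev pt1/pt2/pt3
  for i in 1 2 3; do
      scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc$i
      scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
  done
  # assemble a raid0 volume over the passthru bdevs with a 64 KiB strip size and an on-disk superblock (-s)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s
  # confirm the array is online with all three base bdevs configured, as the test's verify step does
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'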
00:14:49.013 [2024-07-12 15:51:09.301753] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2531729 ] 00:14:49.013 [2024-07-12 15:51:09.392876] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:49.272 [2024-07-12 15:51:09.462232] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:49.272 [2024-07-12 15:51:09.505731] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:49.272 [2024-07-12 15:51:09.505752] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:49.841 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:50.101 malloc1 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:50.101 [2024-07-12 15:51:10.524631] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:50.101 [2024-07-12 15:51:10.524666] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:50.101 [2024-07-12 15:51:10.524679] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa80b50 00:14:50.101 [2024-07-12 15:51:10.524685] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:50.101 [2024-07-12 15:51:10.526072] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:50.101 [2024-07-12 15:51:10.526090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:50.101 pt1 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:50.101 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:50.360 malloc2 00:14:50.361 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:50.620 [2024-07-12 15:51:10.879488] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:50.620 [2024-07-12 15:51:10.879517] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:50.620 [2024-07-12 15:51:10.879526] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa81df0 00:14:50.620 [2024-07-12 15:51:10.879532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:50.620 [2024-07-12 15:51:10.880692] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:50.620 [2024-07-12 15:51:10.880718] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:50.620 pt2 00:14:50.621 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:50.621 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:50.621 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:50.621 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:50.621 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:50.621 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:50.621 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:50.621 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:50.621 15:51:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:50.880 malloc3 00:14:50.880 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:50.880 [2024-07-12 15:51:11.266397] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:50.880 [2024-07-12 15:51:11.266426] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:50.880 [2024-07-12 15:51:11.266435] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa81770 00:14:50.880 [2024-07-12 15:51:11.266441] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:50.880 [2024-07-12 15:51:11.267627] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:50.880 [2024-07-12 15:51:11.267645] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:50.880 pt3 00:14:50.880 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:50.880 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:50.880 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:51.140 [2024-07-12 15:51:11.458902] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:51.140 [2024-07-12 15:51:11.459898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:51.140 [2024-07-12 15:51:11.459939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:51.140 [2024-07-12 15:51:11.460055] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc26cb0 00:14:51.140 [2024-07-12 15:51:11.460062] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:51.140 [2024-07-12 15:51:11.460210] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa81600 00:14:51.140 [2024-07-12 15:51:11.460316] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc26cb0 00:14:51.140 [2024-07-12 15:51:11.460321] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc26cb0 00:14:51.140 [2024-07-12 15:51:11.460390] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.140 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:51.400 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.400 "name": "raid_bdev1", 00:14:51.400 "uuid": "71358bfc-7849-4dc6-b461-697e2eb0a5bb", 00:14:51.400 "strip_size_kb": 64, 00:14:51.400 "state": "online", 00:14:51.400 "raid_level": "raid0", 00:14:51.400 "superblock": true, 00:14:51.400 "num_base_bdevs": 3, 
00:14:51.400 "num_base_bdevs_discovered": 3, 00:14:51.400 "num_base_bdevs_operational": 3, 00:14:51.400 "base_bdevs_list": [ 00:14:51.400 { 00:14:51.400 "name": "pt1", 00:14:51.400 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:51.400 "is_configured": true, 00:14:51.400 "data_offset": 2048, 00:14:51.400 "data_size": 63488 00:14:51.400 }, 00:14:51.400 { 00:14:51.400 "name": "pt2", 00:14:51.400 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:51.400 "is_configured": true, 00:14:51.400 "data_offset": 2048, 00:14:51.400 "data_size": 63488 00:14:51.400 }, 00:14:51.400 { 00:14:51.400 "name": "pt3", 00:14:51.400 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:51.400 "is_configured": true, 00:14:51.400 "data_offset": 2048, 00:14:51.400 "data_size": 63488 00:14:51.400 } 00:14:51.400 ] 00:14:51.400 }' 00:14:51.400 15:51:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.400 15:51:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:51.968 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:51.968 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:51.968 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:51.968 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:51.968 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:51.968 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:51.968 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:51.969 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:51.969 [2024-07-12 15:51:12.365389] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:51.969 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:51.969 "name": "raid_bdev1", 00:14:51.969 "aliases": [ 00:14:51.969 "71358bfc-7849-4dc6-b461-697e2eb0a5bb" 00:14:51.969 ], 00:14:51.969 "product_name": "Raid Volume", 00:14:51.969 "block_size": 512, 00:14:51.969 "num_blocks": 190464, 00:14:51.969 "uuid": "71358bfc-7849-4dc6-b461-697e2eb0a5bb", 00:14:51.969 "assigned_rate_limits": { 00:14:51.969 "rw_ios_per_sec": 0, 00:14:51.969 "rw_mbytes_per_sec": 0, 00:14:51.969 "r_mbytes_per_sec": 0, 00:14:51.969 "w_mbytes_per_sec": 0 00:14:51.969 }, 00:14:51.969 "claimed": false, 00:14:51.969 "zoned": false, 00:14:51.969 "supported_io_types": { 00:14:51.969 "read": true, 00:14:51.969 "write": true, 00:14:51.969 "unmap": true, 00:14:51.969 "flush": true, 00:14:51.969 "reset": true, 00:14:51.969 "nvme_admin": false, 00:14:51.969 "nvme_io": false, 00:14:51.969 "nvme_io_md": false, 00:14:51.969 "write_zeroes": true, 00:14:51.969 "zcopy": false, 00:14:51.969 "get_zone_info": false, 00:14:51.969 "zone_management": false, 00:14:51.969 "zone_append": false, 00:14:51.969 "compare": false, 00:14:51.969 "compare_and_write": false, 00:14:51.969 "abort": false, 00:14:51.969 "seek_hole": false, 00:14:51.969 "seek_data": false, 00:14:51.969 "copy": false, 00:14:51.969 "nvme_iov_md": false 00:14:51.969 }, 00:14:51.969 "memory_domains": [ 00:14:51.969 { 00:14:51.969 "dma_device_id": "system", 00:14:51.969 "dma_device_type": 1 
00:14:51.969 }, 00:14:51.969 { 00:14:51.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.969 "dma_device_type": 2 00:14:51.969 }, 00:14:51.969 { 00:14:51.969 "dma_device_id": "system", 00:14:51.969 "dma_device_type": 1 00:14:51.969 }, 00:14:51.969 { 00:14:51.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.969 "dma_device_type": 2 00:14:51.969 }, 00:14:51.969 { 00:14:51.969 "dma_device_id": "system", 00:14:51.969 "dma_device_type": 1 00:14:51.969 }, 00:14:51.969 { 00:14:51.969 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.969 "dma_device_type": 2 00:14:51.969 } 00:14:51.969 ], 00:14:51.969 "driver_specific": { 00:14:51.969 "raid": { 00:14:51.969 "uuid": "71358bfc-7849-4dc6-b461-697e2eb0a5bb", 00:14:51.969 "strip_size_kb": 64, 00:14:51.969 "state": "online", 00:14:51.969 "raid_level": "raid0", 00:14:51.969 "superblock": true, 00:14:51.969 "num_base_bdevs": 3, 00:14:51.969 "num_base_bdevs_discovered": 3, 00:14:51.969 "num_base_bdevs_operational": 3, 00:14:51.969 "base_bdevs_list": [ 00:14:51.969 { 00:14:51.969 "name": "pt1", 00:14:51.969 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:51.969 "is_configured": true, 00:14:51.969 "data_offset": 2048, 00:14:51.969 "data_size": 63488 00:14:51.969 }, 00:14:51.969 { 00:14:51.969 "name": "pt2", 00:14:51.969 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:51.969 "is_configured": true, 00:14:51.969 "data_offset": 2048, 00:14:51.969 "data_size": 63488 00:14:51.969 }, 00:14:51.969 { 00:14:51.969 "name": "pt3", 00:14:51.969 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:51.969 "is_configured": true, 00:14:51.969 "data_offset": 2048, 00:14:51.969 "data_size": 63488 00:14:51.969 } 00:14:51.969 ] 00:14:51.969 } 00:14:51.969 } 00:14:51.969 }' 00:14:51.969 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:52.230 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:52.230 pt2 00:14:52.230 pt3' 00:14:52.230 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:52.230 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:52.230 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:52.230 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:52.230 "name": "pt1", 00:14:52.230 "aliases": [ 00:14:52.230 "00000000-0000-0000-0000-000000000001" 00:14:52.230 ], 00:14:52.230 "product_name": "passthru", 00:14:52.230 "block_size": 512, 00:14:52.230 "num_blocks": 65536, 00:14:52.230 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:52.230 "assigned_rate_limits": { 00:14:52.230 "rw_ios_per_sec": 0, 00:14:52.230 "rw_mbytes_per_sec": 0, 00:14:52.230 "r_mbytes_per_sec": 0, 00:14:52.230 "w_mbytes_per_sec": 0 00:14:52.230 }, 00:14:52.230 "claimed": true, 00:14:52.230 "claim_type": "exclusive_write", 00:14:52.230 "zoned": false, 00:14:52.230 "supported_io_types": { 00:14:52.230 "read": true, 00:14:52.230 "write": true, 00:14:52.230 "unmap": true, 00:14:52.230 "flush": true, 00:14:52.230 "reset": true, 00:14:52.230 "nvme_admin": false, 00:14:52.230 "nvme_io": false, 00:14:52.230 "nvme_io_md": false, 00:14:52.230 "write_zeroes": true, 00:14:52.230 "zcopy": true, 00:14:52.230 "get_zone_info": false, 00:14:52.230 "zone_management": false, 
00:14:52.230 "zone_append": false, 00:14:52.230 "compare": false, 00:14:52.230 "compare_and_write": false, 00:14:52.230 "abort": true, 00:14:52.230 "seek_hole": false, 00:14:52.230 "seek_data": false, 00:14:52.230 "copy": true, 00:14:52.230 "nvme_iov_md": false 00:14:52.230 }, 00:14:52.230 "memory_domains": [ 00:14:52.230 { 00:14:52.230 "dma_device_id": "system", 00:14:52.230 "dma_device_type": 1 00:14:52.230 }, 00:14:52.230 { 00:14:52.230 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.230 "dma_device_type": 2 00:14:52.230 } 00:14:52.230 ], 00:14:52.230 "driver_specific": { 00:14:52.230 "passthru": { 00:14:52.230 "name": "pt1", 00:14:52.230 "base_bdev_name": "malloc1" 00:14:52.230 } 00:14:52.230 } 00:14:52.230 }' 00:14:52.230 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.230 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.490 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:52.490 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.490 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:52.490 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:52.490 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.490 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:52.490 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:52.490 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.490 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:52.751 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:52.751 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:52.751 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:52.751 15:51:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:52.751 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:52.751 "name": "pt2", 00:14:52.751 "aliases": [ 00:14:52.751 "00000000-0000-0000-0000-000000000002" 00:14:52.751 ], 00:14:52.751 "product_name": "passthru", 00:14:52.751 "block_size": 512, 00:14:52.751 "num_blocks": 65536, 00:14:52.751 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:52.751 "assigned_rate_limits": { 00:14:52.751 "rw_ios_per_sec": 0, 00:14:52.751 "rw_mbytes_per_sec": 0, 00:14:52.751 "r_mbytes_per_sec": 0, 00:14:52.751 "w_mbytes_per_sec": 0 00:14:52.751 }, 00:14:52.751 "claimed": true, 00:14:52.751 "claim_type": "exclusive_write", 00:14:52.751 "zoned": false, 00:14:52.751 "supported_io_types": { 00:14:52.751 "read": true, 00:14:52.751 "write": true, 00:14:52.751 "unmap": true, 00:14:52.751 "flush": true, 00:14:52.751 "reset": true, 00:14:52.751 "nvme_admin": false, 00:14:52.751 "nvme_io": false, 00:14:52.751 "nvme_io_md": false, 00:14:52.751 "write_zeroes": true, 00:14:52.751 "zcopy": true, 00:14:52.751 "get_zone_info": false, 00:14:52.751 "zone_management": false, 00:14:52.751 "zone_append": false, 00:14:52.751 "compare": false, 00:14:52.751 "compare_and_write": false, 00:14:52.751 "abort": true, 
00:14:52.751 "seek_hole": false, 00:14:52.751 "seek_data": false, 00:14:52.751 "copy": true, 00:14:52.751 "nvme_iov_md": false 00:14:52.751 }, 00:14:52.751 "memory_domains": [ 00:14:52.751 { 00:14:52.751 "dma_device_id": "system", 00:14:52.751 "dma_device_type": 1 00:14:52.751 }, 00:14:52.751 { 00:14:52.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.751 "dma_device_type": 2 00:14:52.751 } 00:14:52.751 ], 00:14:52.751 "driver_specific": { 00:14:52.751 "passthru": { 00:14:52.751 "name": "pt2", 00:14:52.751 "base_bdev_name": "malloc2" 00:14:52.751 } 00:14:52.751 } 00:14:52.751 }' 00:14:52.751 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:52.751 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:53.010 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:53.010 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.010 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.010 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:53.010 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.010 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.010 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:53.010 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.269 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.269 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.269 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:53.269 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:53.269 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:53.269 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:53.269 "name": "pt3", 00:14:53.269 "aliases": [ 00:14:53.269 "00000000-0000-0000-0000-000000000003" 00:14:53.269 ], 00:14:53.270 "product_name": "passthru", 00:14:53.270 "block_size": 512, 00:14:53.270 "num_blocks": 65536, 00:14:53.270 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:53.270 "assigned_rate_limits": { 00:14:53.270 "rw_ios_per_sec": 0, 00:14:53.270 "rw_mbytes_per_sec": 0, 00:14:53.270 "r_mbytes_per_sec": 0, 00:14:53.270 "w_mbytes_per_sec": 0 00:14:53.270 }, 00:14:53.270 "claimed": true, 00:14:53.270 "claim_type": "exclusive_write", 00:14:53.270 "zoned": false, 00:14:53.270 "supported_io_types": { 00:14:53.270 "read": true, 00:14:53.270 "write": true, 00:14:53.270 "unmap": true, 00:14:53.270 "flush": true, 00:14:53.270 "reset": true, 00:14:53.270 "nvme_admin": false, 00:14:53.270 "nvme_io": false, 00:14:53.270 "nvme_io_md": false, 00:14:53.270 "write_zeroes": true, 00:14:53.270 "zcopy": true, 00:14:53.270 "get_zone_info": false, 00:14:53.270 "zone_management": false, 00:14:53.270 "zone_append": false, 00:14:53.270 "compare": false, 00:14:53.270 "compare_and_write": false, 00:14:53.270 "abort": true, 00:14:53.270 "seek_hole": false, 00:14:53.270 "seek_data": false, 00:14:53.270 "copy": true, 00:14:53.270 "nvme_iov_md": false 
00:14:53.270 }, 00:14:53.270 "memory_domains": [ 00:14:53.270 { 00:14:53.270 "dma_device_id": "system", 00:14:53.270 "dma_device_type": 1 00:14:53.270 }, 00:14:53.270 { 00:14:53.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:53.270 "dma_device_type": 2 00:14:53.270 } 00:14:53.270 ], 00:14:53.270 "driver_specific": { 00:14:53.270 "passthru": { 00:14:53.270 "name": "pt3", 00:14:53.270 "base_bdev_name": "malloc3" 00:14:53.270 } 00:14:53.270 } 00:14:53.270 }' 00:14:53.270 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:53.529 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:53.529 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:53.529 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.529 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:53.529 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:53.529 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.529 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:53.529 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:53.529 15:51:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.789 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:53.789 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:53.789 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:53.789 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:54.049 [2024-07-12 15:51:14.242135] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:54.049 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=71358bfc-7849-4dc6-b461-697e2eb0a5bb 00:14:54.049 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 71358bfc-7849-4dc6-b461-697e2eb0a5bb ']' 00:14:54.049 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:54.049 [2024-07-12 15:51:14.430390] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:54.049 [2024-07-12 15:51:14.430402] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:54.049 [2024-07-12 15:51:14.430439] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:54.049 [2024-07-12 15:51:14.430479] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:54.049 [2024-07-12 15:51:14.430485] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc26cb0 name raid_bdev1, state offline 00:14:54.049 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.049 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:54.309 15:51:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:54.309 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:54.309 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:54.309 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:54.568 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:54.568 15:51:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:54.828 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:54.828 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:54.828 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:54.828 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:55.087 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 
malloc2 malloc3' -n raid_bdev1 00:14:55.347 [2024-07-12 15:51:15.589280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:55.347 [2024-07-12 15:51:15.590341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:55.347 [2024-07-12 15:51:15.590374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:55.347 [2024-07-12 15:51:15.590407] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:55.347 [2024-07-12 15:51:15.590433] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:55.347 [2024-07-12 15:51:15.590447] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:55.347 [2024-07-12 15:51:15.590457] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:55.347 [2024-07-12 15:51:15.590462] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc32360 name raid_bdev1, state configuring 00:14:55.347 request: 00:14:55.347 { 00:14:55.347 "name": "raid_bdev1", 00:14:55.347 "raid_level": "raid0", 00:14:55.347 "base_bdevs": [ 00:14:55.347 "malloc1", 00:14:55.347 "malloc2", 00:14:55.347 "malloc3" 00:14:55.347 ], 00:14:55.347 "strip_size_kb": 64, 00:14:55.347 "superblock": false, 00:14:55.347 "method": "bdev_raid_create", 00:14:55.347 "req_id": 1 00:14:55.347 } 00:14:55.347 Got JSON-RPC error response 00:14:55.347 response: 00:14:55.347 { 00:14:55.347 "code": -17, 00:14:55.347 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:55.347 } 00:14:55.347 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:55.347 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:55.347 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:55.347 15:51:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:55.347 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.347 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:55.606 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:55.606 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:55.606 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:55.606 [2024-07-12 15:51:15.974204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:55.607 [2024-07-12 15:51:15.974226] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:55.607 [2024-07-12 15:51:15.974236] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc29d80 00:14:55.607 [2024-07-12 15:51:15.974242] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:55.607 [2024-07-12 15:51:15.975482] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:55.607 [2024-07-12 15:51:15.975502] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:55.607 [2024-07-12 15:51:15.975543] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:55.607 [2024-07-12 15:51:15.975565] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:55.607 pt1 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.607 15:51:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:55.866 15:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:55.866 "name": "raid_bdev1", 00:14:55.866 "uuid": "71358bfc-7849-4dc6-b461-697e2eb0a5bb", 00:14:55.866 "strip_size_kb": 64, 00:14:55.866 "state": "configuring", 00:14:55.866 "raid_level": "raid0", 00:14:55.866 "superblock": true, 00:14:55.866 "num_base_bdevs": 3, 00:14:55.866 "num_base_bdevs_discovered": 1, 00:14:55.866 "num_base_bdevs_operational": 3, 00:14:55.866 "base_bdevs_list": [ 00:14:55.866 { 00:14:55.866 "name": "pt1", 00:14:55.866 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:55.866 "is_configured": true, 00:14:55.866 "data_offset": 2048, 00:14:55.866 "data_size": 63488 00:14:55.866 }, 00:14:55.866 { 00:14:55.866 "name": null, 00:14:55.866 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:55.866 "is_configured": false, 00:14:55.866 "data_offset": 2048, 00:14:55.866 "data_size": 63488 00:14:55.866 }, 00:14:55.866 { 00:14:55.866 "name": null, 00:14:55.866 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:55.866 "is_configured": false, 00:14:55.866 "data_offset": 2048, 00:14:55.866 "data_size": 63488 00:14:55.866 } 00:14:55.866 ] 00:14:55.866 }' 00:14:55.866 15:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:55.866 15:51:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.434 15:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:56.435 15:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:56.435 [2024-07-12 15:51:16.868469] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:56.435 [2024-07-12 15:51:16.868500] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:56.435 [2024-07-12 15:51:16.868511] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc270f0 00:14:56.435 [2024-07-12 15:51:16.868517] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:56.435 [2024-07-12 15:51:16.868777] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:56.435 [2024-07-12 15:51:16.868787] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:56.435 [2024-07-12 15:51:16.868829] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:56.435 [2024-07-12 15:51:16.868841] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:56.435 pt2 00:14:56.694 15:51:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:56.694 [2024-07-12 15:51:17.056949] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:56.694 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:56.694 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.695 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:56.955 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.955 "name": "raid_bdev1", 00:14:56.955 "uuid": "71358bfc-7849-4dc6-b461-697e2eb0a5bb", 00:14:56.955 "strip_size_kb": 64, 00:14:56.955 "state": "configuring", 00:14:56.955 "raid_level": "raid0", 00:14:56.955 "superblock": true, 00:14:56.955 "num_base_bdevs": 3, 00:14:56.955 "num_base_bdevs_discovered": 1, 00:14:56.955 "num_base_bdevs_operational": 3, 00:14:56.955 "base_bdevs_list": [ 00:14:56.955 { 00:14:56.955 "name": "pt1", 00:14:56.955 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:56.955 "is_configured": true, 00:14:56.955 "data_offset": 2048, 00:14:56.955 "data_size": 63488 00:14:56.955 }, 00:14:56.955 { 00:14:56.955 "name": null, 00:14:56.955 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:56.955 "is_configured": false, 00:14:56.955 
"data_offset": 2048, 00:14:56.955 "data_size": 63488 00:14:56.955 }, 00:14:56.955 { 00:14:56.955 "name": null, 00:14:56.955 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:56.955 "is_configured": false, 00:14:56.955 "data_offset": 2048, 00:14:56.955 "data_size": 63488 00:14:56.955 } 00:14:56.955 ] 00:14:56.955 }' 00:14:56.955 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.955 15:51:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.524 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:57.524 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:57.524 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:57.524 [2024-07-12 15:51:17.959228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:57.524 [2024-07-12 15:51:17.959257] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:57.524 [2024-07-12 15:51:17.959266] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc242b0 00:14:57.524 [2024-07-12 15:51:17.959272] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:57.524 [2024-07-12 15:51:17.959520] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:57.524 [2024-07-12 15:51:17.959530] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:57.524 [2024-07-12 15:51:17.959568] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:57.524 [2024-07-12 15:51:17.959579] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:57.524 pt2 00:14:57.784 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:57.784 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:57.784 15:51:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:57.784 [2024-07-12 15:51:18.143691] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:57.784 [2024-07-12 15:51:18.143717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:57.784 [2024-07-12 15:51:18.143727] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc24540 00:14:57.784 [2024-07-12 15:51:18.143733] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:57.784 [2024-07-12 15:51:18.143940] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:57.784 [2024-07-12 15:51:18.143949] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:57.784 [2024-07-12 15:51:18.143981] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:57.784 [2024-07-12 15:51:18.143991] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:57.784 [2024-07-12 15:51:18.144068] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc25460 00:14:57.784 [2024-07-12 15:51:18.144073] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:57.784 [2024-07-12 15:51:18.144204] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc32a70 00:14:57.784 [2024-07-12 15:51:18.144303] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc25460 00:14:57.784 [2024-07-12 15:51:18.144308] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc25460 00:14:57.784 [2024-07-12 15:51:18.144378] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:57.784 pt3 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.784 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:58.043 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:58.043 "name": "raid_bdev1", 00:14:58.043 "uuid": "71358bfc-7849-4dc6-b461-697e2eb0a5bb", 00:14:58.043 "strip_size_kb": 64, 00:14:58.043 "state": "online", 00:14:58.043 "raid_level": "raid0", 00:14:58.043 "superblock": true, 00:14:58.043 "num_base_bdevs": 3, 00:14:58.043 "num_base_bdevs_discovered": 3, 00:14:58.043 "num_base_bdevs_operational": 3, 00:14:58.043 "base_bdevs_list": [ 00:14:58.043 { 00:14:58.043 "name": "pt1", 00:14:58.043 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:58.043 "is_configured": true, 00:14:58.043 "data_offset": 2048, 00:14:58.043 "data_size": 63488 00:14:58.043 }, 00:14:58.043 { 00:14:58.043 "name": "pt2", 00:14:58.043 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:58.043 "is_configured": true, 00:14:58.043 "data_offset": 2048, 00:14:58.043 "data_size": 63488 00:14:58.043 }, 00:14:58.043 { 00:14:58.043 "name": "pt3", 00:14:58.043 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:58.043 "is_configured": true, 00:14:58.043 "data_offset": 2048, 00:14:58.043 "data_size": 63488 00:14:58.043 } 00:14:58.043 ] 00:14:58.043 }' 00:14:58.043 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:58.043 15:51:18 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.611 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:58.611 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:58.611 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:58.611 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:58.611 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:58.611 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:58.611 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:58.611 15:51:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:58.870 [2024-07-12 15:51:19.066261] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:58.870 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:58.870 "name": "raid_bdev1", 00:14:58.870 "aliases": [ 00:14:58.870 "71358bfc-7849-4dc6-b461-697e2eb0a5bb" 00:14:58.870 ], 00:14:58.870 "product_name": "Raid Volume", 00:14:58.870 "block_size": 512, 00:14:58.870 "num_blocks": 190464, 00:14:58.870 "uuid": "71358bfc-7849-4dc6-b461-697e2eb0a5bb", 00:14:58.870 "assigned_rate_limits": { 00:14:58.870 "rw_ios_per_sec": 0, 00:14:58.870 "rw_mbytes_per_sec": 0, 00:14:58.870 "r_mbytes_per_sec": 0, 00:14:58.870 "w_mbytes_per_sec": 0 00:14:58.870 }, 00:14:58.870 "claimed": false, 00:14:58.870 "zoned": false, 00:14:58.870 "supported_io_types": { 00:14:58.870 "read": true, 00:14:58.870 "write": true, 00:14:58.870 "unmap": true, 00:14:58.870 "flush": true, 00:14:58.870 "reset": true, 00:14:58.870 "nvme_admin": false, 00:14:58.870 "nvme_io": false, 00:14:58.870 "nvme_io_md": false, 00:14:58.870 "write_zeroes": true, 00:14:58.870 "zcopy": false, 00:14:58.870 "get_zone_info": false, 00:14:58.870 "zone_management": false, 00:14:58.870 "zone_append": false, 00:14:58.870 "compare": false, 00:14:58.870 "compare_and_write": false, 00:14:58.870 "abort": false, 00:14:58.870 "seek_hole": false, 00:14:58.870 "seek_data": false, 00:14:58.870 "copy": false, 00:14:58.870 "nvme_iov_md": false 00:14:58.870 }, 00:14:58.870 "memory_domains": [ 00:14:58.870 { 00:14:58.870 "dma_device_id": "system", 00:14:58.870 "dma_device_type": 1 00:14:58.870 }, 00:14:58.870 { 00:14:58.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.870 "dma_device_type": 2 00:14:58.870 }, 00:14:58.870 { 00:14:58.870 "dma_device_id": "system", 00:14:58.870 "dma_device_type": 1 00:14:58.870 }, 00:14:58.870 { 00:14:58.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.870 "dma_device_type": 2 00:14:58.870 }, 00:14:58.870 { 00:14:58.870 "dma_device_id": "system", 00:14:58.870 "dma_device_type": 1 00:14:58.870 }, 00:14:58.870 { 00:14:58.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.870 "dma_device_type": 2 00:14:58.870 } 00:14:58.870 ], 00:14:58.870 "driver_specific": { 00:14:58.870 "raid": { 00:14:58.870 "uuid": "71358bfc-7849-4dc6-b461-697e2eb0a5bb", 00:14:58.870 "strip_size_kb": 64, 00:14:58.870 "state": "online", 00:14:58.870 "raid_level": "raid0", 00:14:58.870 "superblock": true, 00:14:58.870 "num_base_bdevs": 3, 00:14:58.870 "num_base_bdevs_discovered": 3, 
00:14:58.870 "num_base_bdevs_operational": 3, 00:14:58.870 "base_bdevs_list": [ 00:14:58.870 { 00:14:58.870 "name": "pt1", 00:14:58.870 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:58.870 "is_configured": true, 00:14:58.870 "data_offset": 2048, 00:14:58.870 "data_size": 63488 00:14:58.870 }, 00:14:58.870 { 00:14:58.870 "name": "pt2", 00:14:58.870 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:58.870 "is_configured": true, 00:14:58.870 "data_offset": 2048, 00:14:58.870 "data_size": 63488 00:14:58.870 }, 00:14:58.870 { 00:14:58.870 "name": "pt3", 00:14:58.870 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:58.870 "is_configured": true, 00:14:58.870 "data_offset": 2048, 00:14:58.870 "data_size": 63488 00:14:58.870 } 00:14:58.870 ] 00:14:58.870 } 00:14:58.870 } 00:14:58.870 }' 00:14:58.870 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:58.870 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:58.870 pt2 00:14:58.870 pt3' 00:14:58.870 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.870 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:58.870 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.870 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.870 "name": "pt1", 00:14:58.870 "aliases": [ 00:14:58.870 "00000000-0000-0000-0000-000000000001" 00:14:58.870 ], 00:14:58.870 "product_name": "passthru", 00:14:58.870 "block_size": 512, 00:14:58.870 "num_blocks": 65536, 00:14:58.870 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:58.870 "assigned_rate_limits": { 00:14:58.870 "rw_ios_per_sec": 0, 00:14:58.870 "rw_mbytes_per_sec": 0, 00:14:58.870 "r_mbytes_per_sec": 0, 00:14:58.870 "w_mbytes_per_sec": 0 00:14:58.870 }, 00:14:58.870 "claimed": true, 00:14:58.870 "claim_type": "exclusive_write", 00:14:58.870 "zoned": false, 00:14:58.870 "supported_io_types": { 00:14:58.870 "read": true, 00:14:58.870 "write": true, 00:14:58.870 "unmap": true, 00:14:58.870 "flush": true, 00:14:58.870 "reset": true, 00:14:58.870 "nvme_admin": false, 00:14:58.870 "nvme_io": false, 00:14:58.870 "nvme_io_md": false, 00:14:58.870 "write_zeroes": true, 00:14:58.870 "zcopy": true, 00:14:58.870 "get_zone_info": false, 00:14:58.870 "zone_management": false, 00:14:58.870 "zone_append": false, 00:14:58.870 "compare": false, 00:14:58.870 "compare_and_write": false, 00:14:58.870 "abort": true, 00:14:58.870 "seek_hole": false, 00:14:58.870 "seek_data": false, 00:14:58.870 "copy": true, 00:14:58.870 "nvme_iov_md": false 00:14:58.870 }, 00:14:58.870 "memory_domains": [ 00:14:58.870 { 00:14:58.870 "dma_device_id": "system", 00:14:58.870 "dma_device_type": 1 00:14:58.870 }, 00:14:58.870 { 00:14:58.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.871 "dma_device_type": 2 00:14:58.871 } 00:14:58.871 ], 00:14:58.871 "driver_specific": { 00:14:58.871 "passthru": { 00:14:58.871 "name": "pt1", 00:14:58.871 "base_bdev_name": "malloc1" 00:14:58.871 } 00:14:58.871 } 00:14:58.871 }' 00:14:58.871 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.130 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:14:59.130 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:59.130 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.130 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.130 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:59.130 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.130 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.390 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.390 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.390 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.390 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.390 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:59.390 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:59.390 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:59.649 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:59.649 "name": "pt2", 00:14:59.649 "aliases": [ 00:14:59.649 "00000000-0000-0000-0000-000000000002" 00:14:59.649 ], 00:14:59.649 "product_name": "passthru", 00:14:59.649 "block_size": 512, 00:14:59.649 "num_blocks": 65536, 00:14:59.649 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:59.649 "assigned_rate_limits": { 00:14:59.649 "rw_ios_per_sec": 0, 00:14:59.649 "rw_mbytes_per_sec": 0, 00:14:59.649 "r_mbytes_per_sec": 0, 00:14:59.649 "w_mbytes_per_sec": 0 00:14:59.649 }, 00:14:59.649 "claimed": true, 00:14:59.649 "claim_type": "exclusive_write", 00:14:59.649 "zoned": false, 00:14:59.649 "supported_io_types": { 00:14:59.649 "read": true, 00:14:59.649 "write": true, 00:14:59.649 "unmap": true, 00:14:59.649 "flush": true, 00:14:59.649 "reset": true, 00:14:59.649 "nvme_admin": false, 00:14:59.649 "nvme_io": false, 00:14:59.649 "nvme_io_md": false, 00:14:59.649 "write_zeroes": true, 00:14:59.649 "zcopy": true, 00:14:59.649 "get_zone_info": false, 00:14:59.649 "zone_management": false, 00:14:59.649 "zone_append": false, 00:14:59.649 "compare": false, 00:14:59.649 "compare_and_write": false, 00:14:59.649 "abort": true, 00:14:59.649 "seek_hole": false, 00:14:59.649 "seek_data": false, 00:14:59.649 "copy": true, 00:14:59.649 "nvme_iov_md": false 00:14:59.649 }, 00:14:59.649 "memory_domains": [ 00:14:59.649 { 00:14:59.649 "dma_device_id": "system", 00:14:59.649 "dma_device_type": 1 00:14:59.649 }, 00:14:59.649 { 00:14:59.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:59.649 "dma_device_type": 2 00:14:59.649 } 00:14:59.649 ], 00:14:59.649 "driver_specific": { 00:14:59.649 "passthru": { 00:14:59.649 "name": "pt2", 00:14:59.649 "base_bdev_name": "malloc2" 00:14:59.649 } 00:14:59.649 } 00:14:59.649 }' 00:14:59.649 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.649 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:59.649 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:59.649 15:51:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.649 15:51:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:59.649 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:59.649 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.649 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:59.919 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:59.919 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.919 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.919 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.919 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:59.919 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:59.919 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:00.202 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:00.202 "name": "pt3", 00:15:00.202 "aliases": [ 00:15:00.202 "00000000-0000-0000-0000-000000000003" 00:15:00.202 ], 00:15:00.202 "product_name": "passthru", 00:15:00.202 "block_size": 512, 00:15:00.202 "num_blocks": 65536, 00:15:00.202 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:00.202 "assigned_rate_limits": { 00:15:00.202 "rw_ios_per_sec": 0, 00:15:00.202 "rw_mbytes_per_sec": 0, 00:15:00.202 "r_mbytes_per_sec": 0, 00:15:00.202 "w_mbytes_per_sec": 0 00:15:00.202 }, 00:15:00.202 "claimed": true, 00:15:00.202 "claim_type": "exclusive_write", 00:15:00.202 "zoned": false, 00:15:00.202 "supported_io_types": { 00:15:00.202 "read": true, 00:15:00.202 "write": true, 00:15:00.202 "unmap": true, 00:15:00.202 "flush": true, 00:15:00.202 "reset": true, 00:15:00.202 "nvme_admin": false, 00:15:00.202 "nvme_io": false, 00:15:00.202 "nvme_io_md": false, 00:15:00.202 "write_zeroes": true, 00:15:00.202 "zcopy": true, 00:15:00.202 "get_zone_info": false, 00:15:00.202 "zone_management": false, 00:15:00.202 "zone_append": false, 00:15:00.202 "compare": false, 00:15:00.202 "compare_and_write": false, 00:15:00.202 "abort": true, 00:15:00.202 "seek_hole": false, 00:15:00.202 "seek_data": false, 00:15:00.202 "copy": true, 00:15:00.202 "nvme_iov_md": false 00:15:00.202 }, 00:15:00.202 "memory_domains": [ 00:15:00.202 { 00:15:00.202 "dma_device_id": "system", 00:15:00.202 "dma_device_type": 1 00:15:00.202 }, 00:15:00.202 { 00:15:00.202 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:00.202 "dma_device_type": 2 00:15:00.202 } 00:15:00.202 ], 00:15:00.202 "driver_specific": { 00:15:00.202 "passthru": { 00:15:00.202 "name": "pt3", 00:15:00.202 "base_bdev_name": "malloc3" 00:15:00.202 } 00:15:00.202 } 00:15:00.202 }' 00:15:00.202 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.202 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:00.202 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:00.202 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:00.202 15:51:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:00.203 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:00.203 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:00.203 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:00.203 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:00.203 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.463 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:00.463 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:00.463 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:00.463 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:00.463 [2024-07-12 15:51:20.906934] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 71358bfc-7849-4dc6-b461-697e2eb0a5bb '!=' 71358bfc-7849-4dc6-b461-697e2eb0a5bb ']' 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2531729 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2531729 ']' 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2531729 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2531729 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2531729' 00:15:00.723 killing process with pid 2531729 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2531729 00:15:00.723 [2024-07-12 15:51:20.972815] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:00.723 [2024-07-12 15:51:20.972857] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:00.723 [2024-07-12 15:51:20.972900] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:00.723 [2024-07-12 15:51:20.972906] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc25460 name raid_bdev1, state offline 00:15:00.723 15:51:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2531729 00:15:00.723 [2024-07-12 15:51:20.987762] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:00.723 15:51:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:00.723 00:15:00.723 real 0m11.863s 00:15:00.723 user 0m21.802s 00:15:00.723 sys 0m1.753s 00:15:00.723 15:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:00.723 15:51:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.723 ************************************ 00:15:00.723 END TEST raid_superblock_test 00:15:00.723 ************************************ 00:15:00.723 15:51:21 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:00.723 15:51:21 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:15:00.723 15:51:21 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:00.723 15:51:21 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:00.723 15:51:21 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:00.984 ************************************ 00:15:00.984 START TEST raid_read_error_test 00:15:00.984 ************************************ 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:00.984 15:51:21 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.FRJWAgh2OW 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2534403 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2534403 /var/tmp/spdk-raid.sock 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2534403 ']' 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:00.984 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:00.984 15:51:21 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.984 [2024-07-12 15:51:21.304242] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:15:00.984 [2024-07-12 15:51:21.304367] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2534403 ] 00:15:01.244 [2024-07-12 15:51:21.446138] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:01.244 [2024-07-12 15:51:21.523446] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:01.244 [2024-07-12 15:51:21.574992] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:01.244 [2024-07-12 15:51:21.575019] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:01.815 15:51:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:01.815 15:51:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:01.815 15:51:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:01.815 15:51:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:02.075 BaseBdev1_malloc 00:15:02.075 15:51:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:02.075 true 00:15:02.075 15:51:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:02.335 [2024-07-12 15:51:22.650160] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:02.335 [2024-07-12 15:51:22.650195] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.335 [2024-07-12 15:51:22.650207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11e6aa0 00:15:02.335 [2024-07-12 15:51:22.650213] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.335 [2024-07-12 15:51:22.651491] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.335 [2024-07-12 15:51:22.651510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:02.335 BaseBdev1 00:15:02.335 15:51:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:02.335 15:51:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:02.595 BaseBdev2_malloc 00:15:02.595 15:51:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:02.595 true 00:15:02.876 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:02.876 [2024-07-12 15:51:23.221544] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:02.876 [2024-07-12 15:51:23.221570] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:02.876 [2024-07-12 15:51:23.221581] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11ebe40 00:15:02.876 [2024-07-12 15:51:23.221587] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:02.876 [2024-07-12 15:51:23.222799] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:02.876 [2024-07-12 15:51:23.222818] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:02.876 BaseBdev2 00:15:02.876 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:02.876 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:03.136 BaseBdev3_malloc 00:15:03.136 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:03.396 true 00:15:03.396 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:03.396 [2024-07-12 15:51:23.776802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:03.396 [2024-07-12 15:51:23.776829] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:03.396 [2024-07-12 15:51:23.776841] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11ed7f0 00:15:03.396 [2024-07-12 15:51:23.776848] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:03.396 [2024-07-12 15:51:23.778046] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:03.396 [2024-07-12 15:51:23.778065] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:03.396 BaseBdev3 00:15:03.396 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:03.657 [2024-07-12 15:51:23.953270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:03.657 [2024-07-12 15:51:23.954265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:03.657 [2024-07-12 15:51:23.954318] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:03.657 [2024-07-12 15:51:23.954468] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11eb750 00:15:03.657 [2024-07-12 15:51:23.954478] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:03.657 [2024-07-12 15:51:23.954621] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11ee970 00:15:03.657 [2024-07-12 15:51:23.954741] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11eb750 00:15:03.657 [2024-07-12 15:51:23.954747] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11eb750 00:15:03.657 [2024-07-12 15:51:23.954821] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:03.657 
15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.657 15:51:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:03.918 15:51:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.918 "name": "raid_bdev1", 00:15:03.918 "uuid": "50e08f44-4cca-470a-a1a9-1eca76147c5f", 00:15:03.918 "strip_size_kb": 64, 00:15:03.918 "state": "online", 00:15:03.918 "raid_level": "raid0", 00:15:03.918 "superblock": true, 00:15:03.918 "num_base_bdevs": 3, 00:15:03.918 "num_base_bdevs_discovered": 3, 00:15:03.918 "num_base_bdevs_operational": 3, 00:15:03.918 "base_bdevs_list": [ 00:15:03.918 { 00:15:03.918 "name": "BaseBdev1", 00:15:03.918 "uuid": "f9847033-ebdd-50b1-959f-b6e0188c1eaa", 00:15:03.918 "is_configured": true, 00:15:03.918 "data_offset": 2048, 00:15:03.918 "data_size": 63488 00:15:03.918 }, 00:15:03.918 { 00:15:03.918 "name": "BaseBdev2", 00:15:03.918 "uuid": "9e39ef38-1e7d-5041-8157-a69851d520a3", 00:15:03.918 "is_configured": true, 00:15:03.918 "data_offset": 2048, 00:15:03.918 "data_size": 63488 00:15:03.918 }, 00:15:03.918 { 00:15:03.918 "name": "BaseBdev3", 00:15:03.918 "uuid": "7de59713-ffc8-5567-b70b-18c37ea38c55", 00:15:03.918 "is_configured": true, 00:15:03.918 "data_offset": 2048, 00:15:03.918 "data_size": 63488 00:15:03.918 } 00:15:03.918 ] 00:15:03.918 }' 00:15:03.918 15:51:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.918 15:51:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.487 15:51:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:04.487 15:51:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:04.487 [2024-07-12 15:51:24.887860] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10422f0 00:15:05.426 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:05.686 15:51:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.945 15:51:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:05.945 "name": "raid_bdev1", 00:15:05.945 "uuid": "50e08f44-4cca-470a-a1a9-1eca76147c5f", 00:15:05.945 "strip_size_kb": 64, 00:15:05.945 "state": "online", 00:15:05.945 "raid_level": "raid0", 00:15:05.945 "superblock": true, 00:15:05.945 "num_base_bdevs": 3, 00:15:05.945 "num_base_bdevs_discovered": 3, 00:15:05.945 "num_base_bdevs_operational": 3, 00:15:05.945 "base_bdevs_list": [ 00:15:05.945 { 00:15:05.945 "name": "BaseBdev1", 00:15:05.945 "uuid": "f9847033-ebdd-50b1-959f-b6e0188c1eaa", 00:15:05.945 "is_configured": true, 00:15:05.945 "data_offset": 2048, 00:15:05.945 "data_size": 63488 00:15:05.945 }, 00:15:05.945 { 00:15:05.945 "name": "BaseBdev2", 00:15:05.945 "uuid": "9e39ef38-1e7d-5041-8157-a69851d520a3", 00:15:05.945 "is_configured": true, 00:15:05.945 "data_offset": 2048, 00:15:05.945 "data_size": 63488 00:15:05.945 }, 00:15:05.945 { 00:15:05.945 "name": "BaseBdev3", 00:15:05.945 "uuid": "7de59713-ffc8-5567-b70b-18c37ea38c55", 00:15:05.945 "is_configured": true, 00:15:05.945 "data_offset": 2048, 00:15:05.945 "data_size": 63488 00:15:05.945 } 00:15:05.945 ] 00:15:05.945 }' 00:15:05.945 15:51:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:05.945 15:51:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.884 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:06.884 [2024-07-12 15:51:27.268599] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:06.884 [2024-07-12 15:51:27.268631] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:06.884 [2024-07-12 
15:51:27.271212] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:06.884 [2024-07-12 15:51:27.271238] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:06.884 [2024-07-12 15:51:27.271263] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:06.884 [2024-07-12 15:51:27.271269] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11eb750 name raid_bdev1, state offline 00:15:06.884 0 00:15:06.884 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2534403 00:15:06.884 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2534403 ']' 00:15:06.884 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2534403 00:15:06.884 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:06.884 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:06.884 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2534403 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2534403' 00:15:07.145 killing process with pid 2534403 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2534403 00:15:07.145 [2024-07-12 15:51:27.339423] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2534403 00:15:07.145 [2024-07-12 15:51:27.350394] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.FRJWAgh2OW 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]] 00:15:07.145 00:15:07.145 real 0m6.294s 00:15:07.145 user 0m10.122s 00:15:07.145 sys 0m0.920s 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:07.145 15:51:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:07.145 ************************************ 00:15:07.145 END TEST raid_read_error_test 00:15:07.145 ************************************ 00:15:07.145 15:51:27 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:07.145 15:51:27 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:15:07.145 15:51:27 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
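The raid_read_error_test trace above boils down to a short sequence of rpc.py calls against the bdevperf instance on /var/tmp/spdk-raid.sock: each base device is a 32 MiB malloc bdev wrapped first in an error bdev and then in a passthru bdev, the three passthru bdevs are assembled into a raid0 volume with a superblock, and a read failure is injected into the first error bdev before bdevperf traffic is started. The sketch below replays that sequence with the same commands that appear in the log; it is illustrative only and assumes an SPDK checkout at the path used by this job and a bdevperf process already listening on the socket.

```bash
#!/usr/bin/env bash
# Illustrative replay of the RPC flow traced in raid_io_error_test (read case).
# Assumes bdevperf is already running with -r /var/tmp/spdk-raid.sock.
set -e
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path taken from this job
SOCK=/var/tmp/spdk-raid.sock
rpc() { "$SPDK_DIR/scripts/rpc.py" -s "$SOCK" "$@"; }

for i in 1 2 3; do
    rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"                  # bdev_raid.sh@813
    rpc bdev_error_create "BaseBdev${i}_malloc"                             # bdev_raid.sh@814
    rpc bdev_passthru_create -b "EE_BaseBdev${i}_malloc" -p "BaseBdev${i}"  # bdev_raid.sh@815
done

# raid0, 64 KiB strip size, with superblock (bdev_raid.sh@819)
rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s

# make reads on the first member fail (bdev_raid.sh@827)
rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure
```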
00:15:07.145 15:51:27 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:07.145 15:51:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:07.145 ************************************ 00:15:07.145 START TEST raid_write_error_test 00:15:07.145 ************************************ 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Ac4lCNu9pZ 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2535552 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2535552 /var/tmp/spdk-raid.sock 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2535552 ']' 00:15:07.145 15:51:27 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:07.145 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:07.145 15:51:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:07.405 [2024-07-12 15:51:27.641068] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:15:07.405 [2024-07-12 15:51:27.641126] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2535552 ] 00:15:07.405 [2024-07-12 15:51:27.732426] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:07.405 [2024-07-12 15:51:27.807924] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:07.665 [2024-07-12 15:51:27.855873] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:07.665 [2024-07-12 15:51:27.855900] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:07.924 15:51:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:07.924 15:51:28 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:07.924 15:51:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:07.925 15:51:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:07.925 BaseBdev1_malloc 00:15:07.925 15:51:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:08.184 true 00:15:08.184 15:51:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:08.444 [2024-07-12 15:51:28.714929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:08.444 [2024-07-12 15:51:28.714959] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.444 [2024-07-12 15:51:28.714971] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f4aa0 00:15:08.444 [2024-07-12 15:51:28.714977] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.444 [2024-07-12 15:51:28.716245] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:08.444 [2024-07-12 15:51:28.716264] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:08.444 BaseBdev1 00:15:08.444 15:51:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:08.444 15:51:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:08.704 BaseBdev2_malloc 00:15:08.704 15:51:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:08.704 true 00:15:08.704 15:51:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:08.964 [2024-07-12 15:51:29.258177] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:08.964 [2024-07-12 15:51:29.258205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:08.964 [2024-07-12 15:51:29.258217] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12f9e40 00:15:08.964 [2024-07-12 15:51:29.258223] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:08.964 [2024-07-12 15:51:29.259403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:08.964 [2024-07-12 15:51:29.259421] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:08.964 BaseBdev2 00:15:08.964 15:51:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:08.964 15:51:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:09.224 BaseBdev3_malloc 00:15:09.224 15:51:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:09.224 true 00:15:09.224 15:51:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:09.484 [2024-07-12 15:51:29.829521] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:09.484 [2024-07-12 15:51:29.829549] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:09.484 [2024-07-12 15:51:29.829561] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12fb7f0 00:15:09.484 [2024-07-12 15:51:29.829567] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:09.484 [2024-07-12 15:51:29.830761] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:09.484 [2024-07-12 15:51:29.830779] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:09.484 BaseBdev3 00:15:09.484 15:51:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:09.743 [2024-07-12 
15:51:30.005995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:09.743 [2024-07-12 15:51:30.007012] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:09.743 [2024-07-12 15:51:30.007066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:09.743 [2024-07-12 15:51:30.007219] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12f9750 00:15:09.743 [2024-07-12 15:51:30.007227] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:09.743 [2024-07-12 15:51:30.007379] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12fc970 00:15:09.743 [2024-07-12 15:51:30.007493] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12f9750 00:15:09.743 [2024-07-12 15:51:30.007499] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12f9750 00:15:09.743 [2024-07-12 15:51:30.007574] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.743 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:10.002 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.002 "name": "raid_bdev1", 00:15:10.002 "uuid": "2db96dbf-d29e-4d4f-bbe1-aaf56cd50973", 00:15:10.002 "strip_size_kb": 64, 00:15:10.002 "state": "online", 00:15:10.002 "raid_level": "raid0", 00:15:10.002 "superblock": true, 00:15:10.002 "num_base_bdevs": 3, 00:15:10.002 "num_base_bdevs_discovered": 3, 00:15:10.002 "num_base_bdevs_operational": 3, 00:15:10.002 "base_bdevs_list": [ 00:15:10.002 { 00:15:10.002 "name": "BaseBdev1", 00:15:10.002 "uuid": "ddfc2787-f290-5f5f-9c82-7732e86e2aa0", 00:15:10.002 "is_configured": true, 00:15:10.002 "data_offset": 2048, 00:15:10.002 "data_size": 63488 00:15:10.002 }, 00:15:10.002 { 00:15:10.002 "name": "BaseBdev2", 00:15:10.002 "uuid": "9ab3791d-a6fb-58f9-8e1f-9185104d440c", 00:15:10.002 "is_configured": true, 00:15:10.002 "data_offset": 2048, 00:15:10.002 "data_size": 63488 00:15:10.003 }, 00:15:10.003 { 00:15:10.003 "name": "BaseBdev3", 
00:15:10.003 "uuid": "efcfc080-c0d0-5d0e-901a-38e3fae2598d", 00:15:10.003 "is_configured": true, 00:15:10.003 "data_offset": 2048, 00:15:10.003 "data_size": 63488 00:15:10.003 } 00:15:10.003 ] 00:15:10.003 }' 00:15:10.003 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.003 15:51:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:10.573 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:15:10.573 15:51:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:10.573 [2024-07-12 15:51:30.840311] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11502f0 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.513 15:51:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:11.774 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.774 "name": "raid_bdev1", 00:15:11.774 "uuid": "2db96dbf-d29e-4d4f-bbe1-aaf56cd50973", 00:15:11.774 "strip_size_kb": 64, 00:15:11.774 "state": "online", 00:15:11.774 "raid_level": "raid0", 00:15:11.774 "superblock": true, 00:15:11.774 "num_base_bdevs": 3, 00:15:11.774 "num_base_bdevs_discovered": 3, 00:15:11.774 "num_base_bdevs_operational": 3, 00:15:11.774 "base_bdevs_list": [ 00:15:11.774 { 00:15:11.774 "name": "BaseBdev1", 00:15:11.774 "uuid": "ddfc2787-f290-5f5f-9c82-7732e86e2aa0", 00:15:11.774 "is_configured": true, 00:15:11.774 "data_offset": 2048, 00:15:11.774 "data_size": 63488 00:15:11.774 }, 
00:15:11.774 { 00:15:11.774 "name": "BaseBdev2", 00:15:11.774 "uuid": "9ab3791d-a6fb-58f9-8e1f-9185104d440c", 00:15:11.774 "is_configured": true, 00:15:11.774 "data_offset": 2048, 00:15:11.774 "data_size": 63488 00:15:11.774 }, 00:15:11.774 { 00:15:11.774 "name": "BaseBdev3", 00:15:11.774 "uuid": "efcfc080-c0d0-5d0e-901a-38e3fae2598d", 00:15:11.774 "is_configured": true, 00:15:11.774 "data_offset": 2048, 00:15:11.774 "data_size": 63488 00:15:11.774 } 00:15:11.774 ] 00:15:11.774 }' 00:15:11.774 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.774 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.343 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:12.343 [2024-07-12 15:51:32.770231] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:12.343 [2024-07-12 15:51:32.770264] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:12.343 [2024-07-12 15:51:32.772850] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:12.343 [2024-07-12 15:51:32.772876] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:12.343 [2024-07-12 15:51:32.772902] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:12.343 [2024-07-12 15:51:32.772909] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12f9750 name raid_bdev1, state offline 00:15:12.343 0 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2535552 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2535552 ']' 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2535552 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2535552 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2535552' 00:15:12.603 killing process with pid 2535552 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2535552 00:15:12.603 [2024-07-12 15:51:32.855036] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2535552 00:15:12.603 [2024-07-12 15:51:32.866216] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Ac4lCNu9pZ 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@843 -- # fail_per_s=0.52 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.52 != \0\.\0\0 ]] 00:15:12.603 00:15:12.603 real 0m5.429s 00:15:12.603 user 0m8.962s 00:15:12.603 sys 0m0.820s 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:12.603 15:51:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.603 ************************************ 00:15:12.603 END TEST raid_write_error_test 00:15:12.603 ************************************ 00:15:12.603 15:51:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:12.603 15:51:33 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:15:12.603 15:51:33 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:15:12.603 15:51:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:12.603 15:51:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:12.603 15:51:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:12.865 ************************************ 00:15:12.865 START TEST raid_state_function_test 00:15:12.865 ************************************ 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 
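Both error tests finish the same way, and the write variant just above is no exception: the bdevperf log created by mktemp is parsed, the sixth column of the raid_bdev1 result row is taken as the failures-per-second rate, and because raid0 carries no redundancy the test expects that rate to be non-zero (0.42 in the read run, 0.52 in the write run here). A minimal sketch of that check follows, using the log file name from this run; only the no-redundancy branch exercised here is shown.

```bash
# Minimal sketch of the fail_per_s check seen at bdev_raid.sh@843-847.
# The log path comes from this run's mktemp; only the raid0 (no-redundancy) branch is shown.
bdevperf_log=/raidtest/tmp.Ac4lCNu9pZ

# Column 6 of the raid_bdev1 result line is the failed-I/O-per-second figure.
fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')

# raid0 has no redundancy, so injected errors must surface as failed I/O.
if [[ "$fail_per_s" != "0.00" ]]; then
    echo "injected errors reached raid_bdev1 (fail/s=$fail_per_s), as expected for raid0"
else
    echo "unexpected: no failed I/O recorded for raid_bdev1" >&2
    exit 1
fi
```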
00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2536570 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2536570' 00:15:12.865 Process raid pid: 2536570 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2536570 /var/tmp/spdk-raid.sock 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2536570 ']' 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:12.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:12.865 15:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.865 [2024-07-12 15:51:33.138586] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
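The raid_state_function_test starting here exercises raid bdev state transitions rather than I/O: Existed_Raid is created while its base bdevs do not exist yet, which leaves the raid in the "configuring" state, and the test then registers BaseBdev1/2/3 one by one until the volume goes "online". Every state check in the trace goes through the same bdev_raid_get_bdevs + jq pipeline; the helper below is a rough reconstruction of that check. Only the RPC call and jq filter are visible in this log, so the individual field comparisons are an assumption.

```bash
# Rough reconstruction of the verify_raid_bdev_state helper traced at bdev_raid.sh@116-128.
# The rpc.py + jq pipeline matches the log; the field assertions are an assumption about
# what the helper verifies and are kept deliberately simple.
RPC_PY=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py

verify_raid_bdev_state_sketch() {
    local name=$1 expected_state=$2 raid_level=$3 strip_size=$4 operational=$5
    local info
    info=$("$RPC_PY" -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all |
           jq -r ".[] | select(.name == \"$name\")")
    [ -n "$info" ] || { echo "raid bdev $name not found" >&2; return 1; }
    [ "$(jq -r .state         <<<"$info")" = "$expected_state" ] &&
    [ "$(jq -r .raid_level    <<<"$info")" = "$raid_level"     ] &&
    [ "$(jq -r .strip_size_kb <<<"$info")" = "$strip_size"     ] &&
    [ "$(jq -r .num_base_bdevs_operational <<<"$info")" = "$operational" ]
}

# Calls corresponding to this run:
#   verify_raid_bdev_state_sketch Existed_Raid configuring concat 64 3
#   verify_raid_bdev_state_sketch raid_bdev1   online      raid0  64 3
```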
00:15:12.865 [2024-07-12 15:51:33.138641] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:12.865 [2024-07-12 15:51:33.231307] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:12.865 [2024-07-12 15:51:33.299619] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:13.125 [2024-07-12 15:51:33.350602] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:13.125 [2024-07-12 15:51:33.350626] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:13.695 15:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:13.695 15:51:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:15:13.695 15:51:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:13.954 [2024-07-12 15:51:34.154474] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:13.954 [2024-07-12 15:51:34.154504] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:13.954 [2024-07-12 15:51:34.154510] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:13.954 [2024-07-12 15:51:34.154515] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:13.954 [2024-07-12 15:51:34.154520] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:13.954 [2024-07-12 15:51:34.154525] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:13.954 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:13.954 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.954 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:13.954 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:13.954 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:13.954 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:13.954 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.954 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.955 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.955 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.955 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.955 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.955 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:15:13.955 "name": "Existed_Raid", 00:15:13.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.955 "strip_size_kb": 64, 00:15:13.955 "state": "configuring", 00:15:13.955 "raid_level": "concat", 00:15:13.955 "superblock": false, 00:15:13.955 "num_base_bdevs": 3, 00:15:13.955 "num_base_bdevs_discovered": 0, 00:15:13.955 "num_base_bdevs_operational": 3, 00:15:13.955 "base_bdevs_list": [ 00:15:13.955 { 00:15:13.955 "name": "BaseBdev1", 00:15:13.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.955 "is_configured": false, 00:15:13.955 "data_offset": 0, 00:15:13.955 "data_size": 0 00:15:13.955 }, 00:15:13.955 { 00:15:13.955 "name": "BaseBdev2", 00:15:13.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.955 "is_configured": false, 00:15:13.955 "data_offset": 0, 00:15:13.955 "data_size": 0 00:15:13.955 }, 00:15:13.955 { 00:15:13.955 "name": "BaseBdev3", 00:15:13.955 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.955 "is_configured": false, 00:15:13.955 "data_offset": 0, 00:15:13.955 "data_size": 0 00:15:13.955 } 00:15:13.955 ] 00:15:13.955 }' 00:15:13.955 15:51:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.955 15:51:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:15.336 15:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:15.336 [2024-07-12 15:51:35.614035] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:15.336 [2024-07-12 15:51:35.614056] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13f7900 name Existed_Raid, state configuring 00:15:15.336 15:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:15.595 [2024-07-12 15:51:35.806533] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:15.595 [2024-07-12 15:51:35.806549] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:15.595 [2024-07-12 15:51:35.806554] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:15.595 [2024-07-12 15:51:35.806560] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:15.595 [2024-07-12 15:51:35.806565] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:15.595 [2024-07-12 15:51:35.806570] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:15.595 15:51:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:15.596 [2024-07-12 15:51:36.001669] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:15.596 BaseBdev1 00:15:15.596 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:15.596 15:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:15.596 15:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 
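Each time the state test registers a base bdev (BaseBdev1 is being added in the trace above), it waits for the new bdev to become visible before continuing. The trace shows a bdev_wait_for_examine RPC followed by bdev_get_bdevs -b <name> -t 2000; the helper's full body is not shown in this excerpt, so the sketch below is only an approximation of that waiting step.

```bash
# Approximate sketch of the waitforbdev step traced above (autotest_common.sh@897-905).
# Only the two RPC calls are visible in the log; the surrounding helper logic is assumed.
RPC_PY=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
SOCK=/var/tmp/spdk-raid.sock

waitforbdev_sketch() {
    local bdev_name=$1
    local bdev_timeout=${2:-2000}   # 2000 ms default, as used in this run

    # let any pending bdev examine callbacks finish first
    "$RPC_PY" -s "$SOCK" bdev_wait_for_examine
    # then ask for the bdev by name; -t makes the RPC wait up to $bdev_timeout ms for it to appear
    "$RPC_PY" -s "$SOCK" bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout" > /dev/null
}

# e.g. waitforbdev_sketch BaseBdev1
```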
00:15:15.596 15:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:15.596 15:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:15.596 15:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:15.596 15:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.855 15:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:16.115 [ 00:15:16.115 { 00:15:16.115 "name": "BaseBdev1", 00:15:16.115 "aliases": [ 00:15:16.115 "95111050-5311-492e-916d-0a210743f7e7" 00:15:16.115 ], 00:15:16.115 "product_name": "Malloc disk", 00:15:16.115 "block_size": 512, 00:15:16.115 "num_blocks": 65536, 00:15:16.115 "uuid": "95111050-5311-492e-916d-0a210743f7e7", 00:15:16.115 "assigned_rate_limits": { 00:15:16.115 "rw_ios_per_sec": 0, 00:15:16.115 "rw_mbytes_per_sec": 0, 00:15:16.115 "r_mbytes_per_sec": 0, 00:15:16.115 "w_mbytes_per_sec": 0 00:15:16.115 }, 00:15:16.115 "claimed": true, 00:15:16.115 "claim_type": "exclusive_write", 00:15:16.115 "zoned": false, 00:15:16.115 "supported_io_types": { 00:15:16.115 "read": true, 00:15:16.115 "write": true, 00:15:16.115 "unmap": true, 00:15:16.115 "flush": true, 00:15:16.115 "reset": true, 00:15:16.115 "nvme_admin": false, 00:15:16.115 "nvme_io": false, 00:15:16.115 "nvme_io_md": false, 00:15:16.115 "write_zeroes": true, 00:15:16.115 "zcopy": true, 00:15:16.115 "get_zone_info": false, 00:15:16.115 "zone_management": false, 00:15:16.115 "zone_append": false, 00:15:16.115 "compare": false, 00:15:16.115 "compare_and_write": false, 00:15:16.115 "abort": true, 00:15:16.115 "seek_hole": false, 00:15:16.115 "seek_data": false, 00:15:16.115 "copy": true, 00:15:16.115 "nvme_iov_md": false 00:15:16.115 }, 00:15:16.115 "memory_domains": [ 00:15:16.115 { 00:15:16.115 "dma_device_id": "system", 00:15:16.115 "dma_device_type": 1 00:15:16.115 }, 00:15:16.115 { 00:15:16.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.115 "dma_device_type": 2 00:15:16.115 } 00:15:16.115 ], 00:15:16.115 "driver_specific": {} 00:15:16.115 } 00:15:16.115 ] 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.115 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:16.375 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.375 "name": "Existed_Raid", 00:15:16.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.375 "strip_size_kb": 64, 00:15:16.375 "state": "configuring", 00:15:16.375 "raid_level": "concat", 00:15:16.375 "superblock": false, 00:15:16.375 "num_base_bdevs": 3, 00:15:16.375 "num_base_bdevs_discovered": 1, 00:15:16.375 "num_base_bdevs_operational": 3, 00:15:16.375 "base_bdevs_list": [ 00:15:16.375 { 00:15:16.375 "name": "BaseBdev1", 00:15:16.375 "uuid": "95111050-5311-492e-916d-0a210743f7e7", 00:15:16.375 "is_configured": true, 00:15:16.375 "data_offset": 0, 00:15:16.375 "data_size": 65536 00:15:16.375 }, 00:15:16.375 { 00:15:16.375 "name": "BaseBdev2", 00:15:16.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.375 "is_configured": false, 00:15:16.375 "data_offset": 0, 00:15:16.376 "data_size": 0 00:15:16.376 }, 00:15:16.376 { 00:15:16.376 "name": "BaseBdev3", 00:15:16.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:16.376 "is_configured": false, 00:15:16.376 "data_offset": 0, 00:15:16.376 "data_size": 0 00:15:16.376 } 00:15:16.376 ] 00:15:16.376 }' 00:15:16.376 15:51:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.376 15:51:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:17.315 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:17.315 [2024-07-12 15:51:37.677902] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:17.315 [2024-07-12 15:51:37.677928] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13f7190 name Existed_Raid, state configuring 00:15:17.315 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:17.614 [2024-07-12 15:51:37.874429] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:17.614 [2024-07-12 15:51:37.875525] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:17.614 [2024-07-12 15:51:37.875548] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:17.614 [2024-07-12 15:51:37.875554] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:17.614 [2024-07-12 15:51:37.875560] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:17.614 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:17.614 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:17.614 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:17.614 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.614 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:17.614 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:17.615 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.615 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:17.615 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.615 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.615 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.615 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.615 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.615 15:51:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:17.875 15:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:17.875 "name": "Existed_Raid", 00:15:17.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.875 "strip_size_kb": 64, 00:15:17.875 "state": "configuring", 00:15:17.875 "raid_level": "concat", 00:15:17.875 "superblock": false, 00:15:17.875 "num_base_bdevs": 3, 00:15:17.875 "num_base_bdevs_discovered": 1, 00:15:17.875 "num_base_bdevs_operational": 3, 00:15:17.875 "base_bdevs_list": [ 00:15:17.875 { 00:15:17.875 "name": "BaseBdev1", 00:15:17.875 "uuid": "95111050-5311-492e-916d-0a210743f7e7", 00:15:17.875 "is_configured": true, 00:15:17.875 "data_offset": 0, 00:15:17.875 "data_size": 65536 00:15:17.875 }, 00:15:17.875 { 00:15:17.875 "name": "BaseBdev2", 00:15:17.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.875 "is_configured": false, 00:15:17.875 "data_offset": 0, 00:15:17.875 "data_size": 0 00:15:17.875 }, 00:15:17.875 { 00:15:17.875 "name": "BaseBdev3", 00:15:17.875 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:17.875 "is_configured": false, 00:15:17.875 "data_offset": 0, 00:15:17.875 "data_size": 0 00:15:17.875 } 00:15:17.875 ] 00:15:17.875 }' 00:15:17.875 15:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:17.875 15:51:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:18.817 15:51:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:18.817 [2024-07-12 15:51:39.150556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:18.817 BaseBdev2 00:15:18.817 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:18.817 15:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:18.817 15:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:18.817 15:51:39 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:18.817 15:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:18.817 15:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:18.817 15:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:19.077 [ 00:15:19.077 { 00:15:19.077 "name": "BaseBdev2", 00:15:19.077 "aliases": [ 00:15:19.077 "32f66b61-ef73-444b-bce5-452e903feec7" 00:15:19.077 ], 00:15:19.077 "product_name": "Malloc disk", 00:15:19.077 "block_size": 512, 00:15:19.077 "num_blocks": 65536, 00:15:19.077 "uuid": "32f66b61-ef73-444b-bce5-452e903feec7", 00:15:19.077 "assigned_rate_limits": { 00:15:19.077 "rw_ios_per_sec": 0, 00:15:19.077 "rw_mbytes_per_sec": 0, 00:15:19.077 "r_mbytes_per_sec": 0, 00:15:19.077 "w_mbytes_per_sec": 0 00:15:19.077 }, 00:15:19.077 "claimed": true, 00:15:19.077 "claim_type": "exclusive_write", 00:15:19.077 "zoned": false, 00:15:19.077 "supported_io_types": { 00:15:19.077 "read": true, 00:15:19.077 "write": true, 00:15:19.077 "unmap": true, 00:15:19.077 "flush": true, 00:15:19.077 "reset": true, 00:15:19.077 "nvme_admin": false, 00:15:19.077 "nvme_io": false, 00:15:19.077 "nvme_io_md": false, 00:15:19.077 "write_zeroes": true, 00:15:19.077 "zcopy": true, 00:15:19.077 "get_zone_info": false, 00:15:19.077 "zone_management": false, 00:15:19.077 "zone_append": false, 00:15:19.077 "compare": false, 00:15:19.077 "compare_and_write": false, 00:15:19.077 "abort": true, 00:15:19.077 "seek_hole": false, 00:15:19.077 "seek_data": false, 00:15:19.077 "copy": true, 00:15:19.077 "nvme_iov_md": false 00:15:19.077 }, 00:15:19.077 "memory_domains": [ 00:15:19.077 { 00:15:19.077 "dma_device_id": "system", 00:15:19.077 "dma_device_type": 1 00:15:19.077 }, 00:15:19.077 { 00:15:19.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:19.077 "dma_device_type": 2 00:15:19.077 } 00:15:19.077 ], 00:15:19.077 "driver_specific": {} 00:15:19.077 } 00:15:19.077 ] 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.077 
15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.077 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.336 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.336 "name": "Existed_Raid", 00:15:19.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.336 "strip_size_kb": 64, 00:15:19.336 "state": "configuring", 00:15:19.336 "raid_level": "concat", 00:15:19.336 "superblock": false, 00:15:19.336 "num_base_bdevs": 3, 00:15:19.336 "num_base_bdevs_discovered": 2, 00:15:19.336 "num_base_bdevs_operational": 3, 00:15:19.336 "base_bdevs_list": [ 00:15:19.336 { 00:15:19.336 "name": "BaseBdev1", 00:15:19.336 "uuid": "95111050-5311-492e-916d-0a210743f7e7", 00:15:19.336 "is_configured": true, 00:15:19.336 "data_offset": 0, 00:15:19.336 "data_size": 65536 00:15:19.336 }, 00:15:19.336 { 00:15:19.336 "name": "BaseBdev2", 00:15:19.336 "uuid": "32f66b61-ef73-444b-bce5-452e903feec7", 00:15:19.336 "is_configured": true, 00:15:19.336 "data_offset": 0, 00:15:19.336 "data_size": 65536 00:15:19.336 }, 00:15:19.336 { 00:15:19.336 "name": "BaseBdev3", 00:15:19.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.336 "is_configured": false, 00:15:19.336 "data_offset": 0, 00:15:19.336 "data_size": 0 00:15:19.336 } 00:15:19.336 ] 00:15:19.336 }' 00:15:19.336 15:51:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.336 15:51:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:19.905 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:20.165 [2024-07-12 15:51:40.442587] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:20.165 [2024-07-12 15:51:40.442612] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13f8280 00:15:20.165 [2024-07-12 15:51:40.442616] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:20.165 [2024-07-12 15:51:40.442787] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13f7d70 00:15:20.165 [2024-07-12 15:51:40.442879] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13f8280 00:15:20.165 [2024-07-12 15:51:40.442884] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13f8280 00:15:20.165 [2024-07-12 15:51:40.443002] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:20.165 BaseBdev3 00:15:20.165 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:20.165 15:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:20.165 15:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:20.165 15:51:40 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:20.165 15:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:20.165 15:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:20.165 15:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:20.425 [ 00:15:20.425 { 00:15:20.425 "name": "BaseBdev3", 00:15:20.425 "aliases": [ 00:15:20.425 "e7a04047-c5a6-4cdd-ae3e-77df8b6c6dfd" 00:15:20.425 ], 00:15:20.425 "product_name": "Malloc disk", 00:15:20.425 "block_size": 512, 00:15:20.425 "num_blocks": 65536, 00:15:20.425 "uuid": "e7a04047-c5a6-4cdd-ae3e-77df8b6c6dfd", 00:15:20.425 "assigned_rate_limits": { 00:15:20.425 "rw_ios_per_sec": 0, 00:15:20.425 "rw_mbytes_per_sec": 0, 00:15:20.425 "r_mbytes_per_sec": 0, 00:15:20.425 "w_mbytes_per_sec": 0 00:15:20.425 }, 00:15:20.425 "claimed": true, 00:15:20.425 "claim_type": "exclusive_write", 00:15:20.425 "zoned": false, 00:15:20.425 "supported_io_types": { 00:15:20.425 "read": true, 00:15:20.425 "write": true, 00:15:20.425 "unmap": true, 00:15:20.425 "flush": true, 00:15:20.425 "reset": true, 00:15:20.425 "nvme_admin": false, 00:15:20.425 "nvme_io": false, 00:15:20.425 "nvme_io_md": false, 00:15:20.425 "write_zeroes": true, 00:15:20.425 "zcopy": true, 00:15:20.425 "get_zone_info": false, 00:15:20.425 "zone_management": false, 00:15:20.425 "zone_append": false, 00:15:20.425 "compare": false, 00:15:20.425 "compare_and_write": false, 00:15:20.425 "abort": true, 00:15:20.425 "seek_hole": false, 00:15:20.425 "seek_data": false, 00:15:20.425 "copy": true, 00:15:20.425 "nvme_iov_md": false 00:15:20.425 }, 00:15:20.425 "memory_domains": [ 00:15:20.425 { 00:15:20.425 "dma_device_id": "system", 00:15:20.425 "dma_device_type": 1 00:15:20.425 }, 00:15:20.425 { 00:15:20.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.425 "dma_device_type": 2 00:15:20.425 } 00:15:20.425 ], 00:15:20.425 "driver_specific": {} 00:15:20.425 } 00:15:20.425 ] 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.425 15:51:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.425 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.685 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.685 "name": "Existed_Raid", 00:15:20.685 "uuid": "97d48a44-e230-49ff-9192-65e76b74c3f4", 00:15:20.685 "strip_size_kb": 64, 00:15:20.685 "state": "online", 00:15:20.685 "raid_level": "concat", 00:15:20.685 "superblock": false, 00:15:20.685 "num_base_bdevs": 3, 00:15:20.685 "num_base_bdevs_discovered": 3, 00:15:20.685 "num_base_bdevs_operational": 3, 00:15:20.685 "base_bdevs_list": [ 00:15:20.685 { 00:15:20.685 "name": "BaseBdev1", 00:15:20.685 "uuid": "95111050-5311-492e-916d-0a210743f7e7", 00:15:20.685 "is_configured": true, 00:15:20.685 "data_offset": 0, 00:15:20.685 "data_size": 65536 00:15:20.685 }, 00:15:20.685 { 00:15:20.685 "name": "BaseBdev2", 00:15:20.685 "uuid": "32f66b61-ef73-444b-bce5-452e903feec7", 00:15:20.685 "is_configured": true, 00:15:20.685 "data_offset": 0, 00:15:20.685 "data_size": 65536 00:15:20.685 }, 00:15:20.685 { 00:15:20.685 "name": "BaseBdev3", 00:15:20.685 "uuid": "e7a04047-c5a6-4cdd-ae3e-77df8b6c6dfd", 00:15:20.685 "is_configured": true, 00:15:20.685 "data_offset": 0, 00:15:20.685 "data_size": 65536 00:15:20.685 } 00:15:20.685 ] 00:15:20.685 }' 00:15:20.685 15:51:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.685 15:51:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:21.255 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:21.255 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:21.255 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:21.255 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:21.255 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:21.255 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:21.255 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:21.255 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:21.255 [2024-07-12 15:51:41.694015] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:21.513 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:21.513 "name": "Existed_Raid", 00:15:21.513 "aliases": [ 00:15:21.513 "97d48a44-e230-49ff-9192-65e76b74c3f4" 00:15:21.513 ], 00:15:21.513 "product_name": "Raid Volume", 00:15:21.513 "block_size": 512, 00:15:21.513 "num_blocks": 196608, 00:15:21.513 "uuid": "97d48a44-e230-49ff-9192-65e76b74c3f4", 
00:15:21.513 "assigned_rate_limits": { 00:15:21.513 "rw_ios_per_sec": 0, 00:15:21.513 "rw_mbytes_per_sec": 0, 00:15:21.513 "r_mbytes_per_sec": 0, 00:15:21.513 "w_mbytes_per_sec": 0 00:15:21.513 }, 00:15:21.513 "claimed": false, 00:15:21.513 "zoned": false, 00:15:21.513 "supported_io_types": { 00:15:21.513 "read": true, 00:15:21.513 "write": true, 00:15:21.513 "unmap": true, 00:15:21.513 "flush": true, 00:15:21.513 "reset": true, 00:15:21.513 "nvme_admin": false, 00:15:21.513 "nvme_io": false, 00:15:21.513 "nvme_io_md": false, 00:15:21.513 "write_zeroes": true, 00:15:21.514 "zcopy": false, 00:15:21.514 "get_zone_info": false, 00:15:21.514 "zone_management": false, 00:15:21.514 "zone_append": false, 00:15:21.514 "compare": false, 00:15:21.514 "compare_and_write": false, 00:15:21.514 "abort": false, 00:15:21.514 "seek_hole": false, 00:15:21.514 "seek_data": false, 00:15:21.514 "copy": false, 00:15:21.514 "nvme_iov_md": false 00:15:21.514 }, 00:15:21.514 "memory_domains": [ 00:15:21.514 { 00:15:21.514 "dma_device_id": "system", 00:15:21.514 "dma_device_type": 1 00:15:21.514 }, 00:15:21.514 { 00:15:21.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.514 "dma_device_type": 2 00:15:21.514 }, 00:15:21.514 { 00:15:21.514 "dma_device_id": "system", 00:15:21.514 "dma_device_type": 1 00:15:21.514 }, 00:15:21.514 { 00:15:21.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.514 "dma_device_type": 2 00:15:21.514 }, 00:15:21.514 { 00:15:21.514 "dma_device_id": "system", 00:15:21.514 "dma_device_type": 1 00:15:21.514 }, 00:15:21.514 { 00:15:21.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.514 "dma_device_type": 2 00:15:21.514 } 00:15:21.514 ], 00:15:21.514 "driver_specific": { 00:15:21.514 "raid": { 00:15:21.514 "uuid": "97d48a44-e230-49ff-9192-65e76b74c3f4", 00:15:21.514 "strip_size_kb": 64, 00:15:21.514 "state": "online", 00:15:21.514 "raid_level": "concat", 00:15:21.514 "superblock": false, 00:15:21.514 "num_base_bdevs": 3, 00:15:21.514 "num_base_bdevs_discovered": 3, 00:15:21.514 "num_base_bdevs_operational": 3, 00:15:21.514 "base_bdevs_list": [ 00:15:21.514 { 00:15:21.514 "name": "BaseBdev1", 00:15:21.514 "uuid": "95111050-5311-492e-916d-0a210743f7e7", 00:15:21.514 "is_configured": true, 00:15:21.514 "data_offset": 0, 00:15:21.514 "data_size": 65536 00:15:21.514 }, 00:15:21.514 { 00:15:21.514 "name": "BaseBdev2", 00:15:21.514 "uuid": "32f66b61-ef73-444b-bce5-452e903feec7", 00:15:21.514 "is_configured": true, 00:15:21.514 "data_offset": 0, 00:15:21.514 "data_size": 65536 00:15:21.514 }, 00:15:21.514 { 00:15:21.514 "name": "BaseBdev3", 00:15:21.514 "uuid": "e7a04047-c5a6-4cdd-ae3e-77df8b6c6dfd", 00:15:21.514 "is_configured": true, 00:15:21.514 "data_offset": 0, 00:15:21.514 "data_size": 65536 00:15:21.514 } 00:15:21.514 ] 00:15:21.514 } 00:15:21.514 } 00:15:21.514 }' 00:15:21.514 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:21.514 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:21.514 BaseBdev2 00:15:21.514 BaseBdev3' 00:15:21.514 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:21.514 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:21.514 15:51:41 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:21.514 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:21.514 "name": "BaseBdev1", 00:15:21.514 "aliases": [ 00:15:21.514 "95111050-5311-492e-916d-0a210743f7e7" 00:15:21.514 ], 00:15:21.514 "product_name": "Malloc disk", 00:15:21.514 "block_size": 512, 00:15:21.514 "num_blocks": 65536, 00:15:21.514 "uuid": "95111050-5311-492e-916d-0a210743f7e7", 00:15:21.514 "assigned_rate_limits": { 00:15:21.514 "rw_ios_per_sec": 0, 00:15:21.514 "rw_mbytes_per_sec": 0, 00:15:21.514 "r_mbytes_per_sec": 0, 00:15:21.514 "w_mbytes_per_sec": 0 00:15:21.514 }, 00:15:21.514 "claimed": true, 00:15:21.514 "claim_type": "exclusive_write", 00:15:21.514 "zoned": false, 00:15:21.514 "supported_io_types": { 00:15:21.514 "read": true, 00:15:21.514 "write": true, 00:15:21.514 "unmap": true, 00:15:21.514 "flush": true, 00:15:21.514 "reset": true, 00:15:21.514 "nvme_admin": false, 00:15:21.514 "nvme_io": false, 00:15:21.514 "nvme_io_md": false, 00:15:21.514 "write_zeroes": true, 00:15:21.514 "zcopy": true, 00:15:21.514 "get_zone_info": false, 00:15:21.514 "zone_management": false, 00:15:21.514 "zone_append": false, 00:15:21.514 "compare": false, 00:15:21.514 "compare_and_write": false, 00:15:21.514 "abort": true, 00:15:21.514 "seek_hole": false, 00:15:21.514 "seek_data": false, 00:15:21.514 "copy": true, 00:15:21.514 "nvme_iov_md": false 00:15:21.514 }, 00:15:21.514 "memory_domains": [ 00:15:21.514 { 00:15:21.514 "dma_device_id": "system", 00:15:21.514 "dma_device_type": 1 00:15:21.514 }, 00:15:21.514 { 00:15:21.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.514 "dma_device_type": 2 00:15:21.514 } 00:15:21.514 ], 00:15:21.514 "driver_specific": {} 00:15:21.514 }' 00:15:21.514 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:21.514 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:21.773 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:21.773 15:51:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:21.773 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:21.773 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:21.773 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:21.773 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:21.773 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:21.773 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.033 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.033 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:22.033 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:22.033 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:22.033 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:22.033 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:22.033 "name": "BaseBdev2", 
00:15:22.033 "aliases": [ 00:15:22.033 "32f66b61-ef73-444b-bce5-452e903feec7" 00:15:22.033 ], 00:15:22.033 "product_name": "Malloc disk", 00:15:22.033 "block_size": 512, 00:15:22.033 "num_blocks": 65536, 00:15:22.033 "uuid": "32f66b61-ef73-444b-bce5-452e903feec7", 00:15:22.033 "assigned_rate_limits": { 00:15:22.033 "rw_ios_per_sec": 0, 00:15:22.033 "rw_mbytes_per_sec": 0, 00:15:22.033 "r_mbytes_per_sec": 0, 00:15:22.033 "w_mbytes_per_sec": 0 00:15:22.033 }, 00:15:22.033 "claimed": true, 00:15:22.033 "claim_type": "exclusive_write", 00:15:22.033 "zoned": false, 00:15:22.033 "supported_io_types": { 00:15:22.033 "read": true, 00:15:22.033 "write": true, 00:15:22.033 "unmap": true, 00:15:22.033 "flush": true, 00:15:22.033 "reset": true, 00:15:22.033 "nvme_admin": false, 00:15:22.033 "nvme_io": false, 00:15:22.033 "nvme_io_md": false, 00:15:22.033 "write_zeroes": true, 00:15:22.033 "zcopy": true, 00:15:22.033 "get_zone_info": false, 00:15:22.033 "zone_management": false, 00:15:22.033 "zone_append": false, 00:15:22.033 "compare": false, 00:15:22.034 "compare_and_write": false, 00:15:22.034 "abort": true, 00:15:22.034 "seek_hole": false, 00:15:22.034 "seek_data": false, 00:15:22.034 "copy": true, 00:15:22.034 "nvme_iov_md": false 00:15:22.034 }, 00:15:22.034 "memory_domains": [ 00:15:22.034 { 00:15:22.034 "dma_device_id": "system", 00:15:22.034 "dma_device_type": 1 00:15:22.034 }, 00:15:22.034 { 00:15:22.034 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.034 "dma_device_type": 2 00:15:22.034 } 00:15:22.034 ], 00:15:22.034 "driver_specific": {} 00:15:22.034 }' 00:15:22.034 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.294 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.294 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:22.294 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.294 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.294 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:22.294 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.554 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:22.554 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:22.554 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.554 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:22.554 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:22.554 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:22.554 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:22.554 15:51:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:22.819 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:22.819 "name": "BaseBdev3", 00:15:22.819 "aliases": [ 00:15:22.819 "e7a04047-c5a6-4cdd-ae3e-77df8b6c6dfd" 00:15:22.819 ], 00:15:22.819 "product_name": "Malloc disk", 00:15:22.819 "block_size": 512, 
00:15:22.819 "num_blocks": 65536, 00:15:22.819 "uuid": "e7a04047-c5a6-4cdd-ae3e-77df8b6c6dfd", 00:15:22.819 "assigned_rate_limits": { 00:15:22.819 "rw_ios_per_sec": 0, 00:15:22.819 "rw_mbytes_per_sec": 0, 00:15:22.819 "r_mbytes_per_sec": 0, 00:15:22.819 "w_mbytes_per_sec": 0 00:15:22.819 }, 00:15:22.819 "claimed": true, 00:15:22.819 "claim_type": "exclusive_write", 00:15:22.819 "zoned": false, 00:15:22.819 "supported_io_types": { 00:15:22.819 "read": true, 00:15:22.819 "write": true, 00:15:22.819 "unmap": true, 00:15:22.819 "flush": true, 00:15:22.819 "reset": true, 00:15:22.819 "nvme_admin": false, 00:15:22.819 "nvme_io": false, 00:15:22.819 "nvme_io_md": false, 00:15:22.819 "write_zeroes": true, 00:15:22.819 "zcopy": true, 00:15:22.819 "get_zone_info": false, 00:15:22.819 "zone_management": false, 00:15:22.819 "zone_append": false, 00:15:22.819 "compare": false, 00:15:22.819 "compare_and_write": false, 00:15:22.819 "abort": true, 00:15:22.819 "seek_hole": false, 00:15:22.819 "seek_data": false, 00:15:22.819 "copy": true, 00:15:22.819 "nvme_iov_md": false 00:15:22.819 }, 00:15:22.819 "memory_domains": [ 00:15:22.819 { 00:15:22.819 "dma_device_id": "system", 00:15:22.819 "dma_device_type": 1 00:15:22.819 }, 00:15:22.819 { 00:15:22.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.819 "dma_device_type": 2 00:15:22.819 } 00:15:22.819 ], 00:15:22.819 "driver_specific": {} 00:15:22.819 }' 00:15:22.819 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.819 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:22.819 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:22.819 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:22.819 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:23.078 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:23.078 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.078 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:23.078 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:23.078 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.078 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:23.078 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:23.078 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:23.338 [2024-07-12 15:51:43.606659] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:23.338 [2024-07-12 15:51:43.606678] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:23.338 [2024-07-12 15:51:43.606716] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:23.338 15:51:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.338 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.599 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.599 "name": "Existed_Raid", 00:15:23.599 "uuid": "97d48a44-e230-49ff-9192-65e76b74c3f4", 00:15:23.599 "strip_size_kb": 64, 00:15:23.599 "state": "offline", 00:15:23.599 "raid_level": "concat", 00:15:23.599 "superblock": false, 00:15:23.599 "num_base_bdevs": 3, 00:15:23.599 "num_base_bdevs_discovered": 2, 00:15:23.599 "num_base_bdevs_operational": 2, 00:15:23.599 "base_bdevs_list": [ 00:15:23.599 { 00:15:23.599 "name": null, 00:15:23.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.599 "is_configured": false, 00:15:23.599 "data_offset": 0, 00:15:23.599 "data_size": 65536 00:15:23.599 }, 00:15:23.599 { 00:15:23.599 "name": "BaseBdev2", 00:15:23.599 "uuid": "32f66b61-ef73-444b-bce5-452e903feec7", 00:15:23.599 "is_configured": true, 00:15:23.599 "data_offset": 0, 00:15:23.599 "data_size": 65536 00:15:23.599 }, 00:15:23.599 { 00:15:23.599 "name": "BaseBdev3", 00:15:23.599 "uuid": "e7a04047-c5a6-4cdd-ae3e-77df8b6c6dfd", 00:15:23.599 "is_configured": true, 00:15:23.599 "data_offset": 0, 00:15:23.599 "data_size": 65536 00:15:23.599 } 00:15:23.599 ] 00:15:23.599 }' 00:15:23.599 15:51:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.599 15:51:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:24.169 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:24.169 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:24.169 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.169 15:51:44 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:24.169 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:24.169 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:24.169 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:24.429 [2024-07-12 15:51:44.737543] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:24.429 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:24.429 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:24.429 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.429 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:24.689 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:24.689 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:24.689 15:51:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:24.689 [2024-07-12 15:51:45.124423] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:24.689 [2024-07-12 15:51:45.124455] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13f8280 name Existed_Raid, state offline 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:24.949 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:25.518 BaseBdev2 00:15:25.518 15:51:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:25.518 15:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:25.518 15:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:25.518 15:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
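For reference, the test steps being replayed around this point reduce to a short sequence of SPDK RPC calls. The sketch below is illustrative only, not the test script itself: it assumes an SPDK application is already running with its RPC socket at /var/tmp/spdk-raid.sock and that spdk/scripts/rpc.py and jq are on the PATH; the bdev names simply mirror the ones seen in this trace.

#!/usr/bin/env bash
set -e
RPC="./spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Create two 32 MiB malloc bdevs with a 512-byte block size (65536 blocks each),
# mirroring the BaseBdev2/BaseBdev3 base devices created in the trace above.
$RPC bdev_malloc_create 32 512 -b BaseBdev2
$RPC bdev_malloc_create 32 512 -b BaseBdev3

# Let bdev examination finish, then confirm the bdevs are visible
# (the test's waitforbdev helper does the same with a 2000 ms timeout).
$RPC bdev_wait_for_examine
$RPC bdev_get_bdevs -b BaseBdev2 -t 2000
$RPC bdev_get_bdevs -b BaseBdev3 -t 2000

# Create a 3-way concat raid that also names a base bdev (BaseBdev1) which does
# not exist yet; the raid remains in the "configuring" state until it appears.
$RPC bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

# Inspect the raid state and the per-base-bdev is_configured flags.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'

The -z 64 argument is the strip size in KB and corresponds to the strip_size_kb value reported by bdev_raid_get_bdevs in the dumps above.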
00:15:25.518 15:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:25.518 15:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:25.518 15:51:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:26.088 15:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:26.658 [ 00:15:26.658 { 00:15:26.658 "name": "BaseBdev2", 00:15:26.658 "aliases": [ 00:15:26.658 "c20010a1-c583-40b9-9f4b-773b0be74ec7" 00:15:26.658 ], 00:15:26.658 "product_name": "Malloc disk", 00:15:26.658 "block_size": 512, 00:15:26.658 "num_blocks": 65536, 00:15:26.658 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:26.658 "assigned_rate_limits": { 00:15:26.658 "rw_ios_per_sec": 0, 00:15:26.658 "rw_mbytes_per_sec": 0, 00:15:26.658 "r_mbytes_per_sec": 0, 00:15:26.658 "w_mbytes_per_sec": 0 00:15:26.658 }, 00:15:26.658 "claimed": false, 00:15:26.658 "zoned": false, 00:15:26.658 "supported_io_types": { 00:15:26.658 "read": true, 00:15:26.658 "write": true, 00:15:26.658 "unmap": true, 00:15:26.658 "flush": true, 00:15:26.658 "reset": true, 00:15:26.658 "nvme_admin": false, 00:15:26.658 "nvme_io": false, 00:15:26.658 "nvme_io_md": false, 00:15:26.658 "write_zeroes": true, 00:15:26.658 "zcopy": true, 00:15:26.658 "get_zone_info": false, 00:15:26.658 "zone_management": false, 00:15:26.658 "zone_append": false, 00:15:26.658 "compare": false, 00:15:26.658 "compare_and_write": false, 00:15:26.658 "abort": true, 00:15:26.658 "seek_hole": false, 00:15:26.658 "seek_data": false, 00:15:26.658 "copy": true, 00:15:26.658 "nvme_iov_md": false 00:15:26.658 }, 00:15:26.658 "memory_domains": [ 00:15:26.658 { 00:15:26.658 "dma_device_id": "system", 00:15:26.658 "dma_device_type": 1 00:15:26.658 }, 00:15:26.658 { 00:15:26.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.658 "dma_device_type": 2 00:15:26.658 } 00:15:26.658 ], 00:15:26.658 "driver_specific": {} 00:15:26.658 } 00:15:26.658 ] 00:15:26.658 15:51:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:26.658 15:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:26.658 15:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:26.658 15:51:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:27.229 BaseBdev3 00:15:27.229 15:51:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:27.229 15:51:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:27.229 15:51:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:27.229 15:51:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:27.229 15:51:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:27.229 15:51:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:27.229 15:51:47 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:27.800 15:51:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:28.370 [ 00:15:28.370 { 00:15:28.370 "name": "BaseBdev3", 00:15:28.370 "aliases": [ 00:15:28.370 "f47c33b2-fe6c-4951-80b1-9c4db16f7791" 00:15:28.370 ], 00:15:28.370 "product_name": "Malloc disk", 00:15:28.370 "block_size": 512, 00:15:28.370 "num_blocks": 65536, 00:15:28.370 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:28.370 "assigned_rate_limits": { 00:15:28.370 "rw_ios_per_sec": 0, 00:15:28.370 "rw_mbytes_per_sec": 0, 00:15:28.370 "r_mbytes_per_sec": 0, 00:15:28.370 "w_mbytes_per_sec": 0 00:15:28.370 }, 00:15:28.370 "claimed": false, 00:15:28.370 "zoned": false, 00:15:28.370 "supported_io_types": { 00:15:28.370 "read": true, 00:15:28.370 "write": true, 00:15:28.370 "unmap": true, 00:15:28.370 "flush": true, 00:15:28.370 "reset": true, 00:15:28.370 "nvme_admin": false, 00:15:28.370 "nvme_io": false, 00:15:28.370 "nvme_io_md": false, 00:15:28.370 "write_zeroes": true, 00:15:28.370 "zcopy": true, 00:15:28.370 "get_zone_info": false, 00:15:28.370 "zone_management": false, 00:15:28.370 "zone_append": false, 00:15:28.370 "compare": false, 00:15:28.370 "compare_and_write": false, 00:15:28.370 "abort": true, 00:15:28.370 "seek_hole": false, 00:15:28.370 "seek_data": false, 00:15:28.370 "copy": true, 00:15:28.370 "nvme_iov_md": false 00:15:28.370 }, 00:15:28.370 "memory_domains": [ 00:15:28.370 { 00:15:28.370 "dma_device_id": "system", 00:15:28.370 "dma_device_type": 1 00:15:28.370 }, 00:15:28.370 { 00:15:28.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.370 "dma_device_type": 2 00:15:28.370 } 00:15:28.370 ], 00:15:28.370 "driver_specific": {} 00:15:28.370 } 00:15:28.370 ] 00:15:28.370 15:51:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:28.370 15:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:28.370 15:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:28.370 15:51:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:28.940 [2024-07-12 15:51:49.094219] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:28.940 [2024-07-12 15:51:49.094246] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:28.940 [2024-07-12 15:51:49.094258] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:28.940 [2024-07-12 15:51:49.095488] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.940 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:29.511 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:29.511 "name": "Existed_Raid", 00:15:29.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.511 "strip_size_kb": 64, 00:15:29.511 "state": "configuring", 00:15:29.511 "raid_level": "concat", 00:15:29.511 "superblock": false, 00:15:29.511 "num_base_bdevs": 3, 00:15:29.511 "num_base_bdevs_discovered": 2, 00:15:29.511 "num_base_bdevs_operational": 3, 00:15:29.511 "base_bdevs_list": [ 00:15:29.511 { 00:15:29.511 "name": "BaseBdev1", 00:15:29.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:29.511 "is_configured": false, 00:15:29.511 "data_offset": 0, 00:15:29.511 "data_size": 0 00:15:29.511 }, 00:15:29.511 { 00:15:29.511 "name": "BaseBdev2", 00:15:29.511 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:29.511 "is_configured": true, 00:15:29.511 "data_offset": 0, 00:15:29.511 "data_size": 65536 00:15:29.511 }, 00:15:29.511 { 00:15:29.511 "name": "BaseBdev3", 00:15:29.511 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:29.511 "is_configured": true, 00:15:29.511 "data_offset": 0, 00:15:29.511 "data_size": 65536 00:15:29.511 } 00:15:29.511 ] 00:15:29.511 }' 00:15:29.511 15:51:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.511 15:51:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:29.770 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:30.030 [2024-07-12 15:51:50.373442] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.030 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.289 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.289 "name": "Existed_Raid", 00:15:30.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.289 "strip_size_kb": 64, 00:15:30.290 "state": "configuring", 00:15:30.290 "raid_level": "concat", 00:15:30.290 "superblock": false, 00:15:30.290 "num_base_bdevs": 3, 00:15:30.290 "num_base_bdevs_discovered": 1, 00:15:30.290 "num_base_bdevs_operational": 3, 00:15:30.290 "base_bdevs_list": [ 00:15:30.290 { 00:15:30.290 "name": "BaseBdev1", 00:15:30.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:30.290 "is_configured": false, 00:15:30.290 "data_offset": 0, 00:15:30.290 "data_size": 0 00:15:30.290 }, 00:15:30.290 { 00:15:30.290 "name": null, 00:15:30.290 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:30.290 "is_configured": false, 00:15:30.290 "data_offset": 0, 00:15:30.290 "data_size": 65536 00:15:30.290 }, 00:15:30.290 { 00:15:30.290 "name": "BaseBdev3", 00:15:30.290 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:30.290 "is_configured": true, 00:15:30.290 "data_offset": 0, 00:15:30.290 "data_size": 65536 00:15:30.290 } 00:15:30.290 ] 00:15:30.290 }' 00:15:30.290 15:51:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.290 15:51:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:30.858 15:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.858 15:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:31.118 15:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:31.118 15:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:31.686 [2024-07-12 15:51:51.825931] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:31.686 BaseBdev1 00:15:31.686 15:51:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:31.686 15:51:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:31.686 15:51:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:31.686 15:51:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:31.686 15:51:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:31.686 15:51:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:31.686 15:51:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:32.254 15:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:32.513 [ 00:15:32.513 { 00:15:32.513 "name": "BaseBdev1", 00:15:32.513 "aliases": [ 00:15:32.513 "df71dc05-ecad-486c-a176-0ececf5bda29" 00:15:32.513 ], 00:15:32.513 "product_name": "Malloc disk", 00:15:32.513 "block_size": 512, 00:15:32.513 "num_blocks": 65536, 00:15:32.513 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:32.513 "assigned_rate_limits": { 00:15:32.513 "rw_ios_per_sec": 0, 00:15:32.513 "rw_mbytes_per_sec": 0, 00:15:32.513 "r_mbytes_per_sec": 0, 00:15:32.513 "w_mbytes_per_sec": 0 00:15:32.514 }, 00:15:32.514 "claimed": true, 00:15:32.514 "claim_type": "exclusive_write", 00:15:32.514 "zoned": false, 00:15:32.514 "supported_io_types": { 00:15:32.514 "read": true, 00:15:32.514 "write": true, 00:15:32.514 "unmap": true, 00:15:32.514 "flush": true, 00:15:32.514 "reset": true, 00:15:32.514 "nvme_admin": false, 00:15:32.514 "nvme_io": false, 00:15:32.514 "nvme_io_md": false, 00:15:32.514 "write_zeroes": true, 00:15:32.514 "zcopy": true, 00:15:32.514 "get_zone_info": false, 00:15:32.514 "zone_management": false, 00:15:32.514 "zone_append": false, 00:15:32.514 "compare": false, 00:15:32.514 "compare_and_write": false, 00:15:32.514 "abort": true, 00:15:32.514 "seek_hole": false, 00:15:32.514 "seek_data": false, 00:15:32.514 "copy": true, 00:15:32.514 "nvme_iov_md": false 00:15:32.514 }, 00:15:32.514 "memory_domains": [ 00:15:32.514 { 00:15:32.514 "dma_device_id": "system", 00:15:32.514 "dma_device_type": 1 00:15:32.514 }, 00:15:32.514 { 00:15:32.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.514 "dma_device_type": 2 00:15:32.514 } 00:15:32.514 ], 00:15:32.514 "driver_specific": {} 00:15:32.514 } 00:15:32.514 ] 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:15:32.514 15:51:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.080 15:51:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.081 "name": "Existed_Raid", 00:15:33.081 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:33.081 "strip_size_kb": 64, 00:15:33.081 "state": "configuring", 00:15:33.081 "raid_level": "concat", 00:15:33.081 "superblock": false, 00:15:33.081 "num_base_bdevs": 3, 00:15:33.081 "num_base_bdevs_discovered": 2, 00:15:33.081 "num_base_bdevs_operational": 3, 00:15:33.081 "base_bdevs_list": [ 00:15:33.081 { 00:15:33.081 "name": "BaseBdev1", 00:15:33.081 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:33.081 "is_configured": true, 00:15:33.081 "data_offset": 0, 00:15:33.081 "data_size": 65536 00:15:33.081 }, 00:15:33.081 { 00:15:33.081 "name": null, 00:15:33.081 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:33.081 "is_configured": false, 00:15:33.081 "data_offset": 0, 00:15:33.081 "data_size": 65536 00:15:33.081 }, 00:15:33.081 { 00:15:33.081 "name": "BaseBdev3", 00:15:33.081 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:33.081 "is_configured": true, 00:15:33.081 "data_offset": 0, 00:15:33.081 "data_size": 65536 00:15:33.081 } 00:15:33.081 ] 00:15:33.081 }' 00:15:33.081 15:51:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.081 15:51:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.649 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.649 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:33.909 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:33.909 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:34.479 [2024-07-12 15:51:54.713282] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.479 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.787 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.787 "name": "Existed_Raid", 00:15:34.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:34.787 "strip_size_kb": 64, 00:15:34.787 "state": "configuring", 00:15:34.787 "raid_level": "concat", 00:15:34.787 "superblock": false, 00:15:34.787 "num_base_bdevs": 3, 00:15:34.787 "num_base_bdevs_discovered": 1, 00:15:34.787 "num_base_bdevs_operational": 3, 00:15:34.787 "base_bdevs_list": [ 00:15:34.787 { 00:15:34.787 "name": "BaseBdev1", 00:15:34.787 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:34.787 "is_configured": true, 00:15:34.787 "data_offset": 0, 00:15:34.787 "data_size": 65536 00:15:34.787 }, 00:15:34.787 { 00:15:34.787 "name": null, 00:15:34.787 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:34.787 "is_configured": false, 00:15:34.787 "data_offset": 0, 00:15:34.787 "data_size": 65536 00:15:34.787 }, 00:15:34.787 { 00:15:34.787 "name": null, 00:15:34.787 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:34.787 "is_configured": false, 00:15:34.787 "data_offset": 0, 00:15:34.787 "data_size": 65536 00:15:34.787 } 00:15:34.787 ] 00:15:34.787 }' 00:15:34.787 15:51:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.787 15:51:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:35.049 15:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.049 15:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:35.308 15:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:35.308 15:51:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:35.881 [2024-07-12 15:51:56.185022] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:35.881 15:51:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.881 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:36.140 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.140 "name": "Existed_Raid", 00:15:36.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:36.140 "strip_size_kb": 64, 00:15:36.140 "state": "configuring", 00:15:36.140 "raid_level": "concat", 00:15:36.140 "superblock": false, 00:15:36.140 "num_base_bdevs": 3, 00:15:36.140 "num_base_bdevs_discovered": 2, 00:15:36.140 "num_base_bdevs_operational": 3, 00:15:36.140 "base_bdevs_list": [ 00:15:36.140 { 00:15:36.140 "name": "BaseBdev1", 00:15:36.140 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:36.140 "is_configured": true, 00:15:36.140 "data_offset": 0, 00:15:36.140 "data_size": 65536 00:15:36.140 }, 00:15:36.140 { 00:15:36.140 "name": null, 00:15:36.140 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:36.140 "is_configured": false, 00:15:36.140 "data_offset": 0, 00:15:36.140 "data_size": 65536 00:15:36.140 }, 00:15:36.140 { 00:15:36.140 "name": "BaseBdev3", 00:15:36.140 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:36.140 "is_configured": true, 00:15:36.140 "data_offset": 0, 00:15:36.140 "data_size": 65536 00:15:36.140 } 00:15:36.140 ] 00:15:36.140 }' 00:15:36.140 15:51:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.140 15:51:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.710 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.710 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:36.969 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:36.969 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:36.969 [2024-07-12 15:51:57.408117] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:37.228 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:37.228 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:37.228 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.229 15:51:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.229 "name": "Existed_Raid", 00:15:37.229 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:37.229 "strip_size_kb": 64, 00:15:37.229 "state": "configuring", 00:15:37.229 "raid_level": "concat", 00:15:37.229 "superblock": false, 00:15:37.229 "num_base_bdevs": 3, 00:15:37.229 "num_base_bdevs_discovered": 1, 00:15:37.229 "num_base_bdevs_operational": 3, 00:15:37.229 "base_bdevs_list": [ 00:15:37.229 { 00:15:37.229 "name": null, 00:15:37.229 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:37.229 "is_configured": false, 00:15:37.229 "data_offset": 0, 00:15:37.229 "data_size": 65536 00:15:37.229 }, 00:15:37.229 { 00:15:37.229 "name": null, 00:15:37.229 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:37.229 "is_configured": false, 00:15:37.229 "data_offset": 0, 00:15:37.229 "data_size": 65536 00:15:37.229 }, 00:15:37.229 { 00:15:37.229 "name": "BaseBdev3", 00:15:37.229 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:37.229 "is_configured": true, 00:15:37.229 "data_offset": 0, 00:15:37.229 "data_size": 65536 00:15:37.229 } 00:15:37.229 ] 00:15:37.229 }' 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.229 15:51:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:37.798 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.798 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:38.058 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:38.058 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:38.318 [2024-07-12 15:51:58.512696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.318 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.318 "name": "Existed_Raid", 00:15:38.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:38.318 "strip_size_kb": 64, 00:15:38.318 "state": "configuring", 00:15:38.318 "raid_level": "concat", 00:15:38.318 "superblock": false, 00:15:38.319 "num_base_bdevs": 3, 00:15:38.319 "num_base_bdevs_discovered": 2, 00:15:38.319 "num_base_bdevs_operational": 3, 00:15:38.319 "base_bdevs_list": [ 00:15:38.319 { 00:15:38.319 "name": null, 00:15:38.319 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:38.319 "is_configured": false, 00:15:38.319 "data_offset": 0, 00:15:38.319 "data_size": 65536 00:15:38.319 }, 00:15:38.319 { 00:15:38.319 "name": "BaseBdev2", 00:15:38.319 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:38.319 "is_configured": true, 00:15:38.319 "data_offset": 0, 00:15:38.319 "data_size": 65536 00:15:38.319 }, 00:15:38.319 { 00:15:38.319 "name": "BaseBdev3", 00:15:38.319 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:38.319 "is_configured": true, 00:15:38.319 "data_offset": 0, 00:15:38.319 "data_size": 65536 00:15:38.319 } 00:15:38.319 ] 00:15:38.319 }' 00:15:38.319 15:51:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.319 15:51:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.887 15:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.887 15:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:39.146 15:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:39.146 15:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.146 15:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:39.405 15:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u df71dc05-ecad-486c-a176-0ececf5bda29 00:15:39.405 [2024-07-12 15:51:59.796920] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:39.405 [2024-07-12 15:51:59.796943] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13fb350 00:15:39.405 [2024-07-12 15:51:59.796948] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:39.405 [2024-07-12 15:51:59.797089] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13f7d70 00:15:39.405 [2024-07-12 15:51:59.797176] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13fb350 00:15:39.405 [2024-07-12 15:51:59.797181] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13fb350 00:15:39.405 [2024-07-12 15:51:59.797298] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:39.405 NewBaseBdev 00:15:39.405 15:51:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:39.405 15:51:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:39.405 15:51:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:39.405 15:51:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:15:39.405 15:51:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:39.405 15:51:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:39.405 15:51:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:39.665 15:52:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:39.925 [ 00:15:39.925 { 00:15:39.925 "name": "NewBaseBdev", 00:15:39.925 "aliases": [ 00:15:39.925 "df71dc05-ecad-486c-a176-0ececf5bda29" 00:15:39.925 ], 00:15:39.925 "product_name": "Malloc disk", 00:15:39.925 "block_size": 512, 00:15:39.925 "num_blocks": 65536, 00:15:39.925 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:39.925 "assigned_rate_limits": { 00:15:39.925 "rw_ios_per_sec": 0, 00:15:39.925 "rw_mbytes_per_sec": 0, 00:15:39.925 "r_mbytes_per_sec": 0, 00:15:39.925 "w_mbytes_per_sec": 0 00:15:39.925 }, 00:15:39.925 "claimed": true, 00:15:39.925 "claim_type": "exclusive_write", 00:15:39.925 "zoned": false, 00:15:39.925 "supported_io_types": { 00:15:39.925 "read": true, 00:15:39.925 "write": true, 00:15:39.925 "unmap": true, 00:15:39.925 "flush": true, 00:15:39.925 "reset": true, 00:15:39.925 "nvme_admin": false, 00:15:39.925 "nvme_io": false, 00:15:39.925 "nvme_io_md": false, 00:15:39.925 "write_zeroes": true, 00:15:39.925 "zcopy": true, 00:15:39.925 "get_zone_info": false, 00:15:39.925 "zone_management": false, 00:15:39.925 "zone_append": false, 00:15:39.925 "compare": false, 00:15:39.925 "compare_and_write": false, 00:15:39.925 "abort": true, 00:15:39.925 "seek_hole": false, 00:15:39.925 "seek_data": false, 00:15:39.925 "copy": true, 00:15:39.925 "nvme_iov_md": false 00:15:39.925 }, 00:15:39.925 "memory_domains": [ 00:15:39.925 { 00:15:39.925 "dma_device_id": "system", 00:15:39.925 "dma_device_type": 1 00:15:39.925 }, 00:15:39.925 { 00:15:39.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:39.925 "dma_device_type": 2 00:15:39.925 } 00:15:39.925 ], 00:15:39.925 "driver_specific": {} 00:15:39.925 } 00:15:39.925 ] 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.925 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:40.184 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:40.184 "name": "Existed_Raid", 00:15:40.184 "uuid": "0c5c03e2-82e1-457d-8c41-32689029200e", 00:15:40.184 "strip_size_kb": 64, 00:15:40.184 "state": "online", 00:15:40.184 "raid_level": "concat", 00:15:40.184 "superblock": false, 00:15:40.184 "num_base_bdevs": 3, 00:15:40.184 "num_base_bdevs_discovered": 3, 00:15:40.184 "num_base_bdevs_operational": 3, 00:15:40.184 "base_bdevs_list": [ 00:15:40.184 { 00:15:40.184 "name": "NewBaseBdev", 00:15:40.184 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:40.184 "is_configured": true, 00:15:40.184 "data_offset": 0, 00:15:40.184 "data_size": 65536 00:15:40.184 }, 00:15:40.184 { 00:15:40.184 "name": "BaseBdev2", 00:15:40.184 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:40.184 "is_configured": true, 00:15:40.184 "data_offset": 0, 00:15:40.184 "data_size": 65536 00:15:40.184 }, 00:15:40.184 { 00:15:40.184 "name": "BaseBdev3", 00:15:40.184 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:40.184 "is_configured": true, 00:15:40.184 "data_offset": 0, 00:15:40.184 "data_size": 65536 00:15:40.184 } 00:15:40.184 ] 00:15:40.184 }' 00:15:40.185 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:40.185 15:52:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:40.754 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:40.754 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:40.754 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:40.754 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:40.754 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:40.754 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:40.754 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 
-b Existed_Raid 00:15:40.754 15:52:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:40.754 [2024-07-12 15:52:01.116524] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:40.754 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:40.754 "name": "Existed_Raid", 00:15:40.754 "aliases": [ 00:15:40.754 "0c5c03e2-82e1-457d-8c41-32689029200e" 00:15:40.754 ], 00:15:40.754 "product_name": "Raid Volume", 00:15:40.754 "block_size": 512, 00:15:40.754 "num_blocks": 196608, 00:15:40.754 "uuid": "0c5c03e2-82e1-457d-8c41-32689029200e", 00:15:40.754 "assigned_rate_limits": { 00:15:40.754 "rw_ios_per_sec": 0, 00:15:40.754 "rw_mbytes_per_sec": 0, 00:15:40.754 "r_mbytes_per_sec": 0, 00:15:40.754 "w_mbytes_per_sec": 0 00:15:40.754 }, 00:15:40.754 "claimed": false, 00:15:40.754 "zoned": false, 00:15:40.754 "supported_io_types": { 00:15:40.754 "read": true, 00:15:40.754 "write": true, 00:15:40.754 "unmap": true, 00:15:40.754 "flush": true, 00:15:40.754 "reset": true, 00:15:40.754 "nvme_admin": false, 00:15:40.754 "nvme_io": false, 00:15:40.754 "nvme_io_md": false, 00:15:40.754 "write_zeroes": true, 00:15:40.754 "zcopy": false, 00:15:40.754 "get_zone_info": false, 00:15:40.754 "zone_management": false, 00:15:40.754 "zone_append": false, 00:15:40.754 "compare": false, 00:15:40.754 "compare_and_write": false, 00:15:40.754 "abort": false, 00:15:40.754 "seek_hole": false, 00:15:40.754 "seek_data": false, 00:15:40.754 "copy": false, 00:15:40.754 "nvme_iov_md": false 00:15:40.754 }, 00:15:40.754 "memory_domains": [ 00:15:40.754 { 00:15:40.754 "dma_device_id": "system", 00:15:40.754 "dma_device_type": 1 00:15:40.754 }, 00:15:40.754 { 00:15:40.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.754 "dma_device_type": 2 00:15:40.754 }, 00:15:40.754 { 00:15:40.754 "dma_device_id": "system", 00:15:40.754 "dma_device_type": 1 00:15:40.754 }, 00:15:40.754 { 00:15:40.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.754 "dma_device_type": 2 00:15:40.754 }, 00:15:40.754 { 00:15:40.754 "dma_device_id": "system", 00:15:40.754 "dma_device_type": 1 00:15:40.754 }, 00:15:40.754 { 00:15:40.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:40.754 "dma_device_type": 2 00:15:40.754 } 00:15:40.754 ], 00:15:40.754 "driver_specific": { 00:15:40.754 "raid": { 00:15:40.754 "uuid": "0c5c03e2-82e1-457d-8c41-32689029200e", 00:15:40.754 "strip_size_kb": 64, 00:15:40.754 "state": "online", 00:15:40.754 "raid_level": "concat", 00:15:40.754 "superblock": false, 00:15:40.754 "num_base_bdevs": 3, 00:15:40.754 "num_base_bdevs_discovered": 3, 00:15:40.754 "num_base_bdevs_operational": 3, 00:15:40.754 "base_bdevs_list": [ 00:15:40.754 { 00:15:40.754 "name": "NewBaseBdev", 00:15:40.754 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:40.754 "is_configured": true, 00:15:40.754 "data_offset": 0, 00:15:40.754 "data_size": 65536 00:15:40.754 }, 00:15:40.754 { 00:15:40.754 "name": "BaseBdev2", 00:15:40.754 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:40.754 "is_configured": true, 00:15:40.754 "data_offset": 0, 00:15:40.754 "data_size": 65536 00:15:40.754 }, 00:15:40.754 { 00:15:40.754 "name": "BaseBdev3", 00:15:40.754 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:40.754 "is_configured": true, 00:15:40.754 "data_offset": 0, 00:15:40.754 "data_size": 65536 00:15:40.754 } 00:15:40.754 ] 00:15:40.754 } 00:15:40.754 } 00:15:40.754 }' 00:15:40.754 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:40.754 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:40.754 BaseBdev2 00:15:40.754 BaseBdev3' 00:15:40.754 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:40.754 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:40.754 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:41.013 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:41.013 "name": "NewBaseBdev", 00:15:41.013 "aliases": [ 00:15:41.013 "df71dc05-ecad-486c-a176-0ececf5bda29" 00:15:41.013 ], 00:15:41.013 "product_name": "Malloc disk", 00:15:41.013 "block_size": 512, 00:15:41.013 "num_blocks": 65536, 00:15:41.013 "uuid": "df71dc05-ecad-486c-a176-0ececf5bda29", 00:15:41.013 "assigned_rate_limits": { 00:15:41.013 "rw_ios_per_sec": 0, 00:15:41.013 "rw_mbytes_per_sec": 0, 00:15:41.013 "r_mbytes_per_sec": 0, 00:15:41.013 "w_mbytes_per_sec": 0 00:15:41.013 }, 00:15:41.013 "claimed": true, 00:15:41.013 "claim_type": "exclusive_write", 00:15:41.013 "zoned": false, 00:15:41.013 "supported_io_types": { 00:15:41.013 "read": true, 00:15:41.013 "write": true, 00:15:41.013 "unmap": true, 00:15:41.013 "flush": true, 00:15:41.013 "reset": true, 00:15:41.013 "nvme_admin": false, 00:15:41.013 "nvme_io": false, 00:15:41.013 "nvme_io_md": false, 00:15:41.013 "write_zeroes": true, 00:15:41.013 "zcopy": true, 00:15:41.013 "get_zone_info": false, 00:15:41.013 "zone_management": false, 00:15:41.013 "zone_append": false, 00:15:41.013 "compare": false, 00:15:41.013 "compare_and_write": false, 00:15:41.013 "abort": true, 00:15:41.013 "seek_hole": false, 00:15:41.013 "seek_data": false, 00:15:41.013 "copy": true, 00:15:41.013 "nvme_iov_md": false 00:15:41.013 }, 00:15:41.013 "memory_domains": [ 00:15:41.013 { 00:15:41.013 "dma_device_id": "system", 00:15:41.013 "dma_device_type": 1 00:15:41.013 }, 00:15:41.013 { 00:15:41.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.013 "dma_device_type": 2 00:15:41.013 } 00:15:41.013 ], 00:15:41.013 "driver_specific": {} 00:15:41.013 }' 00:15:41.013 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.013 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.272 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:41.272 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.272 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.272 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:41.272 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.272 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.272 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:41.272 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.273 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.532 15:52:01 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:41.532 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:41.532 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:41.532 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:41.532 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:41.532 "name": "BaseBdev2", 00:15:41.532 "aliases": [ 00:15:41.532 "c20010a1-c583-40b9-9f4b-773b0be74ec7" 00:15:41.532 ], 00:15:41.532 "product_name": "Malloc disk", 00:15:41.532 "block_size": 512, 00:15:41.532 "num_blocks": 65536, 00:15:41.532 "uuid": "c20010a1-c583-40b9-9f4b-773b0be74ec7", 00:15:41.532 "assigned_rate_limits": { 00:15:41.532 "rw_ios_per_sec": 0, 00:15:41.532 "rw_mbytes_per_sec": 0, 00:15:41.532 "r_mbytes_per_sec": 0, 00:15:41.532 "w_mbytes_per_sec": 0 00:15:41.532 }, 00:15:41.532 "claimed": true, 00:15:41.532 "claim_type": "exclusive_write", 00:15:41.532 "zoned": false, 00:15:41.532 "supported_io_types": { 00:15:41.532 "read": true, 00:15:41.532 "write": true, 00:15:41.532 "unmap": true, 00:15:41.532 "flush": true, 00:15:41.532 "reset": true, 00:15:41.532 "nvme_admin": false, 00:15:41.532 "nvme_io": false, 00:15:41.532 "nvme_io_md": false, 00:15:41.532 "write_zeroes": true, 00:15:41.532 "zcopy": true, 00:15:41.532 "get_zone_info": false, 00:15:41.532 "zone_management": false, 00:15:41.532 "zone_append": false, 00:15:41.532 "compare": false, 00:15:41.532 "compare_and_write": false, 00:15:41.532 "abort": true, 00:15:41.532 "seek_hole": false, 00:15:41.532 "seek_data": false, 00:15:41.532 "copy": true, 00:15:41.532 "nvme_iov_md": false 00:15:41.532 }, 00:15:41.532 "memory_domains": [ 00:15:41.532 { 00:15:41.532 "dma_device_id": "system", 00:15:41.532 "dma_device_type": 1 00:15:41.532 }, 00:15:41.532 { 00:15:41.532 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:41.532 "dma_device_type": 2 00:15:41.532 } 00:15:41.532 ], 00:15:41.532 "driver_specific": {} 00:15:41.532 }' 00:15:41.532 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.792 15:52:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:41.792 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:41.792 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.792 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:41.792 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:41.792 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.792 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:41.792 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:41.792 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:41.792 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.052 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.052 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:15:42.052 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:42.052 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.052 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.052 "name": "BaseBdev3", 00:15:42.052 "aliases": [ 00:15:42.052 "f47c33b2-fe6c-4951-80b1-9c4db16f7791" 00:15:42.052 ], 00:15:42.052 "product_name": "Malloc disk", 00:15:42.052 "block_size": 512, 00:15:42.052 "num_blocks": 65536, 00:15:42.052 "uuid": "f47c33b2-fe6c-4951-80b1-9c4db16f7791", 00:15:42.052 "assigned_rate_limits": { 00:15:42.052 "rw_ios_per_sec": 0, 00:15:42.052 "rw_mbytes_per_sec": 0, 00:15:42.052 "r_mbytes_per_sec": 0, 00:15:42.052 "w_mbytes_per_sec": 0 00:15:42.052 }, 00:15:42.052 "claimed": true, 00:15:42.052 "claim_type": "exclusive_write", 00:15:42.052 "zoned": false, 00:15:42.052 "supported_io_types": { 00:15:42.052 "read": true, 00:15:42.052 "write": true, 00:15:42.052 "unmap": true, 00:15:42.052 "flush": true, 00:15:42.052 "reset": true, 00:15:42.052 "nvme_admin": false, 00:15:42.052 "nvme_io": false, 00:15:42.052 "nvme_io_md": false, 00:15:42.052 "write_zeroes": true, 00:15:42.052 "zcopy": true, 00:15:42.052 "get_zone_info": false, 00:15:42.052 "zone_management": false, 00:15:42.052 "zone_append": false, 00:15:42.052 "compare": false, 00:15:42.052 "compare_and_write": false, 00:15:42.052 "abort": true, 00:15:42.052 "seek_hole": false, 00:15:42.052 "seek_data": false, 00:15:42.052 "copy": true, 00:15:42.052 "nvme_iov_md": false 00:15:42.052 }, 00:15:42.052 "memory_domains": [ 00:15:42.052 { 00:15:42.052 "dma_device_id": "system", 00:15:42.052 "dma_device_type": 1 00:15:42.052 }, 00:15:42.052 { 00:15:42.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.052 "dma_device_type": 2 00:15:42.052 } 00:15:42.052 ], 00:15:42.053 "driver_specific": {} 00:15:42.053 }' 00:15:42.053 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.334 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:42.594 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:42.594 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:42.594 [2024-07-12 15:52:02.960958] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:42.594 [2024-07-12 15:52:02.960978] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:42.594 [2024-07-12 15:52:02.961020] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:42.594 [2024-07-12 15:52:02.961058] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:42.594 [2024-07-12 15:52:02.961064] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13fb350 name Existed_Raid, state offline 00:15:42.594 15:52:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2536570 00:15:42.594 15:52:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2536570 ']' 00:15:42.594 15:52:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2536570 00:15:42.594 15:52:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:42.594 15:52:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:42.594 15:52:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2536570 00:15:42.594 15:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:42.594 15:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:42.594 15:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2536570' 00:15:42.594 killing process with pid 2536570 00:15:42.594 15:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2536570 00:15:42.594 [2024-07-12 15:52:03.026164] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:42.594 15:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2536570 00:15:42.594 [2024-07-12 15:52:03.041057] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:42.854 00:15:42.854 real 0m30.086s 00:15:42.854 user 0m56.608s 00:15:42.854 sys 0m4.010s 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.854 ************************************ 00:15:42.854 END TEST raid_state_function_test 00:15:42.854 ************************************ 00:15:42.854 15:52:03 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:42.854 15:52:03 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:42.854 15:52:03 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:42.854 15:52:03 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:42.854 15:52:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:42.854 ************************************ 00:15:42.854 START TEST raid_state_function_test_sb 00:15:42.854 ************************************ 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # 
local raid_level=concat 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2542223 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2542223' 00:15:42.854 Process raid pid: 2542223 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2542223 /var/tmp/spdk-raid.sock 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2542223 ']' 
00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:42.854 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:42.854 15:52:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:43.114 [2024-07-12 15:52:03.304367] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:15:43.114 [2024-07-12 15:52:03.304420] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:43.114 [2024-07-12 15:52:03.394791] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:43.114 [2024-07-12 15:52:03.461249] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:43.114 [2024-07-12 15:52:03.500472] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:43.114 [2024-07-12 15:52:03.500494] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:44.052 [2024-07-12 15:52:04.311320] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:44.052 [2024-07-12 15:52:04.311348] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:44.052 [2024-07-12 15:52:04.311354] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:44.052 [2024-07-12 15:52:04.311361] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:44.052 [2024-07-12 15:52:04.311365] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:44.052 [2024-07-12 15:52:04.311370] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:44.052 15:52:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:44.052 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:44.053 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.053 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:44.312 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:44.312 "name": "Existed_Raid", 00:15:44.312 "uuid": "c4dfbf63-6c2d-491e-84e9-57bdb4b711cd", 00:15:44.312 "strip_size_kb": 64, 00:15:44.312 "state": "configuring", 00:15:44.312 "raid_level": "concat", 00:15:44.312 "superblock": true, 00:15:44.312 "num_base_bdevs": 3, 00:15:44.312 "num_base_bdevs_discovered": 0, 00:15:44.312 "num_base_bdevs_operational": 3, 00:15:44.312 "base_bdevs_list": [ 00:15:44.312 { 00:15:44.312 "name": "BaseBdev1", 00:15:44.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.312 "is_configured": false, 00:15:44.312 "data_offset": 0, 00:15:44.312 "data_size": 0 00:15:44.312 }, 00:15:44.312 { 00:15:44.312 "name": "BaseBdev2", 00:15:44.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.312 "is_configured": false, 00:15:44.312 "data_offset": 0, 00:15:44.312 "data_size": 0 00:15:44.312 }, 00:15:44.312 { 00:15:44.312 "name": "BaseBdev3", 00:15:44.312 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:44.312 "is_configured": false, 00:15:44.312 "data_offset": 0, 00:15:44.312 "data_size": 0 00:15:44.312 } 00:15:44.312 ] 00:15:44.312 }' 00:15:44.312 15:52:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:44.312 15:52:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:44.880 15:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:44.880 [2024-07-12 15:52:05.225520] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:44.880 [2024-07-12 15:52:05.225536] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c52900 name Existed_Raid, state configuring 00:15:44.880 15:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:45.139 [2024-07-12 15:52:05.422039] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:45.139 [2024-07-12 15:52:05.422055] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:45.139 [2024-07-12 15:52:05.422060] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:45.139 [2024-07-12 15:52:05.422066] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:15:45.139 [2024-07-12 15:52:05.422071] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:45.139 [2024-07-12 15:52:05.422077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:45.139 15:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:45.399 [2024-07-12 15:52:05.621068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:45.399 BaseBdev1 00:15:45.399 15:52:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:45.399 15:52:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:45.399 15:52:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:45.399 15:52:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:45.399 15:52:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:45.399 15:52:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:45.399 15:52:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:45.399 15:52:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:45.658 [ 00:15:45.658 { 00:15:45.658 "name": "BaseBdev1", 00:15:45.658 "aliases": [ 00:15:45.658 "75c11989-f3ae-450a-ba93-ebea1e3ba6bd" 00:15:45.658 ], 00:15:45.658 "product_name": "Malloc disk", 00:15:45.658 "block_size": 512, 00:15:45.658 "num_blocks": 65536, 00:15:45.658 "uuid": "75c11989-f3ae-450a-ba93-ebea1e3ba6bd", 00:15:45.658 "assigned_rate_limits": { 00:15:45.658 "rw_ios_per_sec": 0, 00:15:45.658 "rw_mbytes_per_sec": 0, 00:15:45.658 "r_mbytes_per_sec": 0, 00:15:45.658 "w_mbytes_per_sec": 0 00:15:45.658 }, 00:15:45.658 "claimed": true, 00:15:45.658 "claim_type": "exclusive_write", 00:15:45.658 "zoned": false, 00:15:45.658 "supported_io_types": { 00:15:45.659 "read": true, 00:15:45.659 "write": true, 00:15:45.659 "unmap": true, 00:15:45.659 "flush": true, 00:15:45.659 "reset": true, 00:15:45.659 "nvme_admin": false, 00:15:45.659 "nvme_io": false, 00:15:45.659 "nvme_io_md": false, 00:15:45.659 "write_zeroes": true, 00:15:45.659 "zcopy": true, 00:15:45.659 "get_zone_info": false, 00:15:45.659 "zone_management": false, 00:15:45.659 "zone_append": false, 00:15:45.659 "compare": false, 00:15:45.659 "compare_and_write": false, 00:15:45.659 "abort": true, 00:15:45.659 "seek_hole": false, 00:15:45.659 "seek_data": false, 00:15:45.659 "copy": true, 00:15:45.659 "nvme_iov_md": false 00:15:45.659 }, 00:15:45.659 "memory_domains": [ 00:15:45.659 { 00:15:45.659 "dma_device_id": "system", 00:15:45.659 "dma_device_type": 1 00:15:45.659 }, 00:15:45.659 { 00:15:45.659 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.659 "dma_device_type": 2 00:15:45.659 } 00:15:45.659 ], 00:15:45.659 "driver_specific": {} 00:15:45.659 } 00:15:45.659 ] 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:45.659 
15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:45.659 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:45.918 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:45.918 "name": "Existed_Raid", 00:15:45.918 "uuid": "1fc498cd-db8c-458a-a5d7-5cc27e927529", 00:15:45.918 "strip_size_kb": 64, 00:15:45.918 "state": "configuring", 00:15:45.918 "raid_level": "concat", 00:15:45.918 "superblock": true, 00:15:45.918 "num_base_bdevs": 3, 00:15:45.918 "num_base_bdevs_discovered": 1, 00:15:45.918 "num_base_bdevs_operational": 3, 00:15:45.918 "base_bdevs_list": [ 00:15:45.918 { 00:15:45.918 "name": "BaseBdev1", 00:15:45.918 "uuid": "75c11989-f3ae-450a-ba93-ebea1e3ba6bd", 00:15:45.918 "is_configured": true, 00:15:45.918 "data_offset": 2048, 00:15:45.918 "data_size": 63488 00:15:45.918 }, 00:15:45.918 { 00:15:45.918 "name": "BaseBdev2", 00:15:45.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.918 "is_configured": false, 00:15:45.918 "data_offset": 0, 00:15:45.918 "data_size": 0 00:15:45.918 }, 00:15:45.918 { 00:15:45.918 "name": "BaseBdev3", 00:15:45.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:45.918 "is_configured": false, 00:15:45.918 "data_offset": 0, 00:15:45.918 "data_size": 0 00:15:45.918 } 00:15:45.919 ] 00:15:45.919 }' 00:15:45.919 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:45.919 15:52:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:46.488 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:46.488 [2024-07-12 15:52:06.900277] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:46.488 [2024-07-12 15:52:06.900302] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c52190 name Existed_Raid, state configuring 00:15:46.488 15:52:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:46.748 [2024-07-12 15:52:07.092796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:46.748 [2024-07-12 15:52:07.093879] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:46.748 [2024-07-12 15:52:07.093901] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:46.748 [2024-07-12 15:52:07.093906] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:46.748 [2024-07-12 15:52:07.093912] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.748 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:47.007 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.007 "name": "Existed_Raid", 00:15:47.007 "uuid": "b4fead49-5890-4041-98a9-3638c0bc33e7", 00:15:47.007 "strip_size_kb": 64, 00:15:47.007 "state": "configuring", 00:15:47.007 "raid_level": "concat", 00:15:47.007 "superblock": true, 00:15:47.007 "num_base_bdevs": 3, 00:15:47.007 "num_base_bdevs_discovered": 1, 00:15:47.007 "num_base_bdevs_operational": 3, 00:15:47.007 "base_bdevs_list": [ 00:15:47.007 { 00:15:47.007 "name": "BaseBdev1", 00:15:47.007 "uuid": "75c11989-f3ae-450a-ba93-ebea1e3ba6bd", 00:15:47.007 "is_configured": true, 00:15:47.007 "data_offset": 2048, 00:15:47.007 "data_size": 63488 00:15:47.007 }, 00:15:47.007 { 00:15:47.007 "name": "BaseBdev2", 00:15:47.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.007 "is_configured": false, 00:15:47.007 "data_offset": 0, 00:15:47.007 "data_size": 0 00:15:47.007 }, 00:15:47.007 { 
00:15:47.007 "name": "BaseBdev3", 00:15:47.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:47.007 "is_configured": false, 00:15:47.007 "data_offset": 0, 00:15:47.007 "data_size": 0 00:15:47.007 } 00:15:47.007 ] 00:15:47.007 }' 00:15:47.007 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.007 15:52:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:47.576 15:52:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:47.835 [2024-07-12 15:52:08.024077] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:47.835 BaseBdev2 00:15:47.835 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:47.835 15:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:47.835 15:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:47.835 15:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:47.835 15:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:47.835 15:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:47.835 15:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:47.835 15:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:48.095 [ 00:15:48.095 { 00:15:48.095 "name": "BaseBdev2", 00:15:48.095 "aliases": [ 00:15:48.095 "51b17af7-21cf-45e5-94a7-805c01028847" 00:15:48.095 ], 00:15:48.095 "product_name": "Malloc disk", 00:15:48.095 "block_size": 512, 00:15:48.095 "num_blocks": 65536, 00:15:48.095 "uuid": "51b17af7-21cf-45e5-94a7-805c01028847", 00:15:48.095 "assigned_rate_limits": { 00:15:48.095 "rw_ios_per_sec": 0, 00:15:48.095 "rw_mbytes_per_sec": 0, 00:15:48.095 "r_mbytes_per_sec": 0, 00:15:48.095 "w_mbytes_per_sec": 0 00:15:48.095 }, 00:15:48.095 "claimed": true, 00:15:48.095 "claim_type": "exclusive_write", 00:15:48.095 "zoned": false, 00:15:48.095 "supported_io_types": { 00:15:48.095 "read": true, 00:15:48.095 "write": true, 00:15:48.095 "unmap": true, 00:15:48.095 "flush": true, 00:15:48.095 "reset": true, 00:15:48.095 "nvme_admin": false, 00:15:48.095 "nvme_io": false, 00:15:48.095 "nvme_io_md": false, 00:15:48.095 "write_zeroes": true, 00:15:48.095 "zcopy": true, 00:15:48.095 "get_zone_info": false, 00:15:48.095 "zone_management": false, 00:15:48.095 "zone_append": false, 00:15:48.095 "compare": false, 00:15:48.095 "compare_and_write": false, 00:15:48.095 "abort": true, 00:15:48.095 "seek_hole": false, 00:15:48.095 "seek_data": false, 00:15:48.095 "copy": true, 00:15:48.095 "nvme_iov_md": false 00:15:48.095 }, 00:15:48.095 "memory_domains": [ 00:15:48.095 { 00:15:48.095 "dma_device_id": "system", 00:15:48.095 "dma_device_type": 1 00:15:48.095 }, 00:15:48.095 { 00:15:48.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:48.095 "dma_device_type": 2 00:15:48.095 } 00:15:48.095 ], 00:15:48.095 
"driver_specific": {} 00:15:48.095 } 00:15:48.095 ] 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.095 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:48.355 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.355 "name": "Existed_Raid", 00:15:48.355 "uuid": "b4fead49-5890-4041-98a9-3638c0bc33e7", 00:15:48.355 "strip_size_kb": 64, 00:15:48.355 "state": "configuring", 00:15:48.355 "raid_level": "concat", 00:15:48.355 "superblock": true, 00:15:48.355 "num_base_bdevs": 3, 00:15:48.355 "num_base_bdevs_discovered": 2, 00:15:48.355 "num_base_bdevs_operational": 3, 00:15:48.355 "base_bdevs_list": [ 00:15:48.355 { 00:15:48.355 "name": "BaseBdev1", 00:15:48.355 "uuid": "75c11989-f3ae-450a-ba93-ebea1e3ba6bd", 00:15:48.355 "is_configured": true, 00:15:48.355 "data_offset": 2048, 00:15:48.355 "data_size": 63488 00:15:48.355 }, 00:15:48.355 { 00:15:48.355 "name": "BaseBdev2", 00:15:48.355 "uuid": "51b17af7-21cf-45e5-94a7-805c01028847", 00:15:48.355 "is_configured": true, 00:15:48.355 "data_offset": 2048, 00:15:48.355 "data_size": 63488 00:15:48.355 }, 00:15:48.355 { 00:15:48.355 "name": "BaseBdev3", 00:15:48.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:48.355 "is_configured": false, 00:15:48.355 "data_offset": 0, 00:15:48.355 "data_size": 0 00:15:48.355 } 00:15:48.355 ] 00:15:48.355 }' 00:15:48.355 15:52:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.355 15:52:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:48.924 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 
00:15:48.924 [2024-07-12 15:52:09.328373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:48.924 [2024-07-12 15:52:09.328493] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c53280 00:15:48.924 [2024-07-12 15:52:09.328501] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:48.924 [2024-07-12 15:52:09.328640] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c52d70 00:15:48.924 [2024-07-12 15:52:09.328740] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c53280 00:15:48.924 [2024-07-12 15:52:09.328746] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c53280 00:15:48.924 [2024-07-12 15:52:09.328814] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:48.924 BaseBdev3 00:15:48.924 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:48.924 15:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:48.924 15:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:48.924 15:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:48.924 15:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:48.924 15:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:48.924 15:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:49.183 15:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:49.442 [ 00:15:49.442 { 00:15:49.442 "name": "BaseBdev3", 00:15:49.442 "aliases": [ 00:15:49.442 "3af5e321-c2b6-46eb-9e58-0ff024d89866" 00:15:49.442 ], 00:15:49.442 "product_name": "Malloc disk", 00:15:49.442 "block_size": 512, 00:15:49.442 "num_blocks": 65536, 00:15:49.442 "uuid": "3af5e321-c2b6-46eb-9e58-0ff024d89866", 00:15:49.442 "assigned_rate_limits": { 00:15:49.442 "rw_ios_per_sec": 0, 00:15:49.442 "rw_mbytes_per_sec": 0, 00:15:49.442 "r_mbytes_per_sec": 0, 00:15:49.442 "w_mbytes_per_sec": 0 00:15:49.442 }, 00:15:49.442 "claimed": true, 00:15:49.442 "claim_type": "exclusive_write", 00:15:49.442 "zoned": false, 00:15:49.442 "supported_io_types": { 00:15:49.442 "read": true, 00:15:49.442 "write": true, 00:15:49.442 "unmap": true, 00:15:49.442 "flush": true, 00:15:49.442 "reset": true, 00:15:49.442 "nvme_admin": false, 00:15:49.442 "nvme_io": false, 00:15:49.442 "nvme_io_md": false, 00:15:49.442 "write_zeroes": true, 00:15:49.442 "zcopy": true, 00:15:49.442 "get_zone_info": false, 00:15:49.442 "zone_management": false, 00:15:49.442 "zone_append": false, 00:15:49.442 "compare": false, 00:15:49.442 "compare_and_write": false, 00:15:49.442 "abort": true, 00:15:49.442 "seek_hole": false, 00:15:49.442 "seek_data": false, 00:15:49.442 "copy": true, 00:15:49.442 "nvme_iov_md": false 00:15:49.442 }, 00:15:49.442 "memory_domains": [ 00:15:49.442 { 00:15:49.442 "dma_device_id": "system", 00:15:49.442 "dma_device_type": 1 00:15:49.442 }, 00:15:49.442 { 00:15:49.442 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:49.442 "dma_device_type": 2 00:15:49.442 } 00:15:49.442 ], 00:15:49.442 "driver_specific": {} 00:15:49.442 } 00:15:49.442 ] 00:15:49.442 15:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.443 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:49.702 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.702 "name": "Existed_Raid", 00:15:49.702 "uuid": "b4fead49-5890-4041-98a9-3638c0bc33e7", 00:15:49.702 "strip_size_kb": 64, 00:15:49.702 "state": "online", 00:15:49.702 "raid_level": "concat", 00:15:49.702 "superblock": true, 00:15:49.702 "num_base_bdevs": 3, 00:15:49.702 "num_base_bdevs_discovered": 3, 00:15:49.702 "num_base_bdevs_operational": 3, 00:15:49.702 "base_bdevs_list": [ 00:15:49.702 { 00:15:49.702 "name": "BaseBdev1", 00:15:49.702 "uuid": "75c11989-f3ae-450a-ba93-ebea1e3ba6bd", 00:15:49.702 "is_configured": true, 00:15:49.702 "data_offset": 2048, 00:15:49.702 "data_size": 63488 00:15:49.702 }, 00:15:49.702 { 00:15:49.702 "name": "BaseBdev2", 00:15:49.702 "uuid": "51b17af7-21cf-45e5-94a7-805c01028847", 00:15:49.702 "is_configured": true, 00:15:49.702 "data_offset": 2048, 00:15:49.702 "data_size": 63488 00:15:49.702 }, 00:15:49.702 { 00:15:49.702 "name": "BaseBdev3", 00:15:49.702 "uuid": "3af5e321-c2b6-46eb-9e58-0ff024d89866", 00:15:49.702 "is_configured": true, 00:15:49.702 "data_offset": 2048, 00:15:49.702 "data_size": 63488 00:15:49.702 } 00:15:49.702 ] 00:15:49.702 }' 00:15:49.702 15:52:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.702 15:52:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:50.269 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 
00:15:50.269 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:50.269 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:50.269 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:50.269 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:50.269 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:50.269 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:50.269 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:50.269 [2024-07-12 15:52:10.680479] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:50.269 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:50.269 "name": "Existed_Raid", 00:15:50.269 "aliases": [ 00:15:50.269 "b4fead49-5890-4041-98a9-3638c0bc33e7" 00:15:50.269 ], 00:15:50.269 "product_name": "Raid Volume", 00:15:50.269 "block_size": 512, 00:15:50.269 "num_blocks": 190464, 00:15:50.269 "uuid": "b4fead49-5890-4041-98a9-3638c0bc33e7", 00:15:50.269 "assigned_rate_limits": { 00:15:50.269 "rw_ios_per_sec": 0, 00:15:50.269 "rw_mbytes_per_sec": 0, 00:15:50.269 "r_mbytes_per_sec": 0, 00:15:50.269 "w_mbytes_per_sec": 0 00:15:50.269 }, 00:15:50.269 "claimed": false, 00:15:50.269 "zoned": false, 00:15:50.269 "supported_io_types": { 00:15:50.269 "read": true, 00:15:50.269 "write": true, 00:15:50.269 "unmap": true, 00:15:50.269 "flush": true, 00:15:50.269 "reset": true, 00:15:50.269 "nvme_admin": false, 00:15:50.269 "nvme_io": false, 00:15:50.269 "nvme_io_md": false, 00:15:50.269 "write_zeroes": true, 00:15:50.269 "zcopy": false, 00:15:50.269 "get_zone_info": false, 00:15:50.269 "zone_management": false, 00:15:50.269 "zone_append": false, 00:15:50.269 "compare": false, 00:15:50.269 "compare_and_write": false, 00:15:50.269 "abort": false, 00:15:50.269 "seek_hole": false, 00:15:50.269 "seek_data": false, 00:15:50.269 "copy": false, 00:15:50.269 "nvme_iov_md": false 00:15:50.269 }, 00:15:50.269 "memory_domains": [ 00:15:50.269 { 00:15:50.269 "dma_device_id": "system", 00:15:50.269 "dma_device_type": 1 00:15:50.269 }, 00:15:50.269 { 00:15:50.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.269 "dma_device_type": 2 00:15:50.269 }, 00:15:50.269 { 00:15:50.269 "dma_device_id": "system", 00:15:50.269 "dma_device_type": 1 00:15:50.269 }, 00:15:50.269 { 00:15:50.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.269 "dma_device_type": 2 00:15:50.269 }, 00:15:50.269 { 00:15:50.269 "dma_device_id": "system", 00:15:50.270 "dma_device_type": 1 00:15:50.270 }, 00:15:50.270 { 00:15:50.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.270 "dma_device_type": 2 00:15:50.270 } 00:15:50.270 ], 00:15:50.270 "driver_specific": { 00:15:50.270 "raid": { 00:15:50.270 "uuid": "b4fead49-5890-4041-98a9-3638c0bc33e7", 00:15:50.270 "strip_size_kb": 64, 00:15:50.270 "state": "online", 00:15:50.270 "raid_level": "concat", 00:15:50.270 "superblock": true, 00:15:50.270 "num_base_bdevs": 3, 00:15:50.270 "num_base_bdevs_discovered": 3, 00:15:50.270 "num_base_bdevs_operational": 3, 00:15:50.270 "base_bdevs_list": [ 00:15:50.270 { 00:15:50.270 "name": "BaseBdev1", 00:15:50.270 
"uuid": "75c11989-f3ae-450a-ba93-ebea1e3ba6bd", 00:15:50.270 "is_configured": true, 00:15:50.270 "data_offset": 2048, 00:15:50.270 "data_size": 63488 00:15:50.270 }, 00:15:50.270 { 00:15:50.270 "name": "BaseBdev2", 00:15:50.270 "uuid": "51b17af7-21cf-45e5-94a7-805c01028847", 00:15:50.270 "is_configured": true, 00:15:50.270 "data_offset": 2048, 00:15:50.270 "data_size": 63488 00:15:50.270 }, 00:15:50.270 { 00:15:50.270 "name": "BaseBdev3", 00:15:50.270 "uuid": "3af5e321-c2b6-46eb-9e58-0ff024d89866", 00:15:50.270 "is_configured": true, 00:15:50.270 "data_offset": 2048, 00:15:50.270 "data_size": 63488 00:15:50.270 } 00:15:50.270 ] 00:15:50.270 } 00:15:50.270 } 00:15:50.270 }' 00:15:50.270 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:50.529 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:50.529 BaseBdev2 00:15:50.529 BaseBdev3' 00:15:50.529 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:50.529 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:50.529 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:50.529 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:50.529 "name": "BaseBdev1", 00:15:50.529 "aliases": [ 00:15:50.529 "75c11989-f3ae-450a-ba93-ebea1e3ba6bd" 00:15:50.529 ], 00:15:50.529 "product_name": "Malloc disk", 00:15:50.529 "block_size": 512, 00:15:50.529 "num_blocks": 65536, 00:15:50.529 "uuid": "75c11989-f3ae-450a-ba93-ebea1e3ba6bd", 00:15:50.529 "assigned_rate_limits": { 00:15:50.529 "rw_ios_per_sec": 0, 00:15:50.529 "rw_mbytes_per_sec": 0, 00:15:50.529 "r_mbytes_per_sec": 0, 00:15:50.529 "w_mbytes_per_sec": 0 00:15:50.529 }, 00:15:50.529 "claimed": true, 00:15:50.529 "claim_type": "exclusive_write", 00:15:50.529 "zoned": false, 00:15:50.529 "supported_io_types": { 00:15:50.529 "read": true, 00:15:50.529 "write": true, 00:15:50.529 "unmap": true, 00:15:50.529 "flush": true, 00:15:50.529 "reset": true, 00:15:50.529 "nvme_admin": false, 00:15:50.529 "nvme_io": false, 00:15:50.529 "nvme_io_md": false, 00:15:50.529 "write_zeroes": true, 00:15:50.529 "zcopy": true, 00:15:50.529 "get_zone_info": false, 00:15:50.529 "zone_management": false, 00:15:50.529 "zone_append": false, 00:15:50.529 "compare": false, 00:15:50.529 "compare_and_write": false, 00:15:50.529 "abort": true, 00:15:50.529 "seek_hole": false, 00:15:50.529 "seek_data": false, 00:15:50.529 "copy": true, 00:15:50.529 "nvme_iov_md": false 00:15:50.529 }, 00:15:50.529 "memory_domains": [ 00:15:50.529 { 00:15:50.529 "dma_device_id": "system", 00:15:50.529 "dma_device_type": 1 00:15:50.529 }, 00:15:50.529 { 00:15:50.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.529 "dma_device_type": 2 00:15:50.529 } 00:15:50.529 ], 00:15:50.529 "driver_specific": {} 00:15:50.529 }' 00:15:50.529 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:50.814 15:52:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:50.814 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:50.814 15:52:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:50.814 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:50.814 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:50.814 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.814 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.814 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:50.814 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:50.814 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.089 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:51.089 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:51.089 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:51.089 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.089 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.089 "name": "BaseBdev2", 00:15:51.090 "aliases": [ 00:15:51.090 "51b17af7-21cf-45e5-94a7-805c01028847" 00:15:51.090 ], 00:15:51.090 "product_name": "Malloc disk", 00:15:51.090 "block_size": 512, 00:15:51.090 "num_blocks": 65536, 00:15:51.090 "uuid": "51b17af7-21cf-45e5-94a7-805c01028847", 00:15:51.090 "assigned_rate_limits": { 00:15:51.090 "rw_ios_per_sec": 0, 00:15:51.090 "rw_mbytes_per_sec": 0, 00:15:51.090 "r_mbytes_per_sec": 0, 00:15:51.090 "w_mbytes_per_sec": 0 00:15:51.090 }, 00:15:51.090 "claimed": true, 00:15:51.090 "claim_type": "exclusive_write", 00:15:51.090 "zoned": false, 00:15:51.090 "supported_io_types": { 00:15:51.090 "read": true, 00:15:51.090 "write": true, 00:15:51.090 "unmap": true, 00:15:51.090 "flush": true, 00:15:51.090 "reset": true, 00:15:51.090 "nvme_admin": false, 00:15:51.090 "nvme_io": false, 00:15:51.090 "nvme_io_md": false, 00:15:51.090 "write_zeroes": true, 00:15:51.090 "zcopy": true, 00:15:51.090 "get_zone_info": false, 00:15:51.090 "zone_management": false, 00:15:51.090 "zone_append": false, 00:15:51.090 "compare": false, 00:15:51.090 "compare_and_write": false, 00:15:51.090 "abort": true, 00:15:51.090 "seek_hole": false, 00:15:51.090 "seek_data": false, 00:15:51.090 "copy": true, 00:15:51.090 "nvme_iov_md": false 00:15:51.090 }, 00:15:51.090 "memory_domains": [ 00:15:51.090 { 00:15:51.090 "dma_device_id": "system", 00:15:51.090 "dma_device_type": 1 00:15:51.090 }, 00:15:51.090 { 00:15:51.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.090 "dma_device_type": 2 00:15:51.090 } 00:15:51.090 ], 00:15:51.090 "driver_specific": {} 00:15:51.090 }' 00:15:51.090 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.090 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.350 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.350 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.350 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:51.350 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.350 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.350 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.350 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.350 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.609 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.609 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:51.609 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:51.609 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:51.609 15:52:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.609 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.609 "name": "BaseBdev3", 00:15:51.609 "aliases": [ 00:15:51.609 "3af5e321-c2b6-46eb-9e58-0ff024d89866" 00:15:51.609 ], 00:15:51.609 "product_name": "Malloc disk", 00:15:51.609 "block_size": 512, 00:15:51.609 "num_blocks": 65536, 00:15:51.609 "uuid": "3af5e321-c2b6-46eb-9e58-0ff024d89866", 00:15:51.609 "assigned_rate_limits": { 00:15:51.609 "rw_ios_per_sec": 0, 00:15:51.609 "rw_mbytes_per_sec": 0, 00:15:51.609 "r_mbytes_per_sec": 0, 00:15:51.609 "w_mbytes_per_sec": 0 00:15:51.609 }, 00:15:51.609 "claimed": true, 00:15:51.609 "claim_type": "exclusive_write", 00:15:51.609 "zoned": false, 00:15:51.609 "supported_io_types": { 00:15:51.609 "read": true, 00:15:51.609 "write": true, 00:15:51.609 "unmap": true, 00:15:51.609 "flush": true, 00:15:51.609 "reset": true, 00:15:51.609 "nvme_admin": false, 00:15:51.609 "nvme_io": false, 00:15:51.609 "nvme_io_md": false, 00:15:51.609 "write_zeroes": true, 00:15:51.609 "zcopy": true, 00:15:51.609 "get_zone_info": false, 00:15:51.609 "zone_management": false, 00:15:51.609 "zone_append": false, 00:15:51.609 "compare": false, 00:15:51.609 "compare_and_write": false, 00:15:51.609 "abort": true, 00:15:51.609 "seek_hole": false, 00:15:51.609 "seek_data": false, 00:15:51.609 "copy": true, 00:15:51.609 "nvme_iov_md": false 00:15:51.609 }, 00:15:51.609 "memory_domains": [ 00:15:51.609 { 00:15:51.609 "dma_device_id": "system", 00:15:51.609 "dma_device_type": 1 00:15:51.609 }, 00:15:51.609 { 00:15:51.609 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.609 "dma_device_type": 2 00:15:51.609 } 00:15:51.609 ], 00:15:51.609 "driver_specific": {} 00:15:51.609 }' 00:15:51.609 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.867 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.867 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.867 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.867 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.867 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.867 
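The property verification running in the surrounding entries walks each configured base bdev and compares a handful of bdev_get_bdevs fields (block_size, md_size, md_interleave, dif_type) between the raid volume and the base bdev, which is why every probe appears twice followed by a [[ ... == ... ]] check. A condensed sketch of one iteration, under the same assumptions as above (socket path from the log; BaseBdev2 picked arbitrarily as the example):

    sock=/var/tmp/spdk-raid.sock
    raid=$(./scripts/rpc.py -s "$sock" bdev_get_bdevs -b Existed_Raid | jq '.[]')
    base=$(./scripts/rpc.py -s "$sock" bdev_get_bdevs -b BaseBdev2    | jq '.[]')
    # The raid volume must expose the same block layout as its base bdevs;
    # absent fields come back as "null" on both sides, matching the trace.
    for field in .block_size .md_size .md_interleave .dif_type; do
        [[ "$(jq "$field" <<<"$raid")" == "$(jq "$field" <<<"$base")" ]]
    done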
15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.867 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.867 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.867 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:52.126 [2024-07-12 15:52:12.544978] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:52.126 [2024-07-12 15:52:12.544996] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:52.126 [2024-07-12 15:52:12.545028] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.126 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:52.386 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:52.387 "name": "Existed_Raid", 00:15:52.387 "uuid": "b4fead49-5890-4041-98a9-3638c0bc33e7", 00:15:52.387 "strip_size_kb": 64, 00:15:52.387 "state": "offline", 00:15:52.387 "raid_level": 
"concat", 00:15:52.387 "superblock": true, 00:15:52.387 "num_base_bdevs": 3, 00:15:52.387 "num_base_bdevs_discovered": 2, 00:15:52.387 "num_base_bdevs_operational": 2, 00:15:52.387 "base_bdevs_list": [ 00:15:52.387 { 00:15:52.387 "name": null, 00:15:52.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:52.387 "is_configured": false, 00:15:52.387 "data_offset": 2048, 00:15:52.387 "data_size": 63488 00:15:52.387 }, 00:15:52.387 { 00:15:52.387 "name": "BaseBdev2", 00:15:52.387 "uuid": "51b17af7-21cf-45e5-94a7-805c01028847", 00:15:52.387 "is_configured": true, 00:15:52.387 "data_offset": 2048, 00:15:52.387 "data_size": 63488 00:15:52.387 }, 00:15:52.387 { 00:15:52.387 "name": "BaseBdev3", 00:15:52.387 "uuid": "3af5e321-c2b6-46eb-9e58-0ff024d89866", 00:15:52.387 "is_configured": true, 00:15:52.387 "data_offset": 2048, 00:15:52.387 "data_size": 63488 00:15:52.387 } 00:15:52.387 ] 00:15:52.387 }' 00:15:52.387 15:52:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:52.387 15:52:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:52.955 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:52.955 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:52.955 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:52.955 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.214 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:53.214 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:53.214 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:53.473 [2024-07-12 15:52:13.683862] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:53.473 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:53.473 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:53.473 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.473 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:53.473 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:53.473 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:53.473 15:52:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:53.733 [2024-07-12 15:52:14.050748] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:53.734 [2024-07-12 15:52:14.050779] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c53280 name Existed_Raid, state offline 00:15:53.734 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:15:53.734 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:53.734 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.734 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:53.993 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:53.993 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:53.993 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:53.993 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:53.993 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:53.993 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:54.253 BaseBdev2 00:15:54.253 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:54.253 15:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:54.253 15:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:54.253 15:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:54.253 15:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:54.253 15:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:54.253 15:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:54.253 15:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:54.513 [ 00:15:54.513 { 00:15:54.513 "name": "BaseBdev2", 00:15:54.513 "aliases": [ 00:15:54.513 "acc206ae-4161-402d-894d-0b06f261c0cc" 00:15:54.513 ], 00:15:54.513 "product_name": "Malloc disk", 00:15:54.513 "block_size": 512, 00:15:54.513 "num_blocks": 65536, 00:15:54.513 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:15:54.513 "assigned_rate_limits": { 00:15:54.513 "rw_ios_per_sec": 0, 00:15:54.513 "rw_mbytes_per_sec": 0, 00:15:54.513 "r_mbytes_per_sec": 0, 00:15:54.513 "w_mbytes_per_sec": 0 00:15:54.513 }, 00:15:54.513 "claimed": false, 00:15:54.513 "zoned": false, 00:15:54.513 "supported_io_types": { 00:15:54.513 "read": true, 00:15:54.513 "write": true, 00:15:54.513 "unmap": true, 00:15:54.513 "flush": true, 00:15:54.513 "reset": true, 00:15:54.513 "nvme_admin": false, 00:15:54.513 "nvme_io": false, 00:15:54.513 "nvme_io_md": false, 00:15:54.513 "write_zeroes": true, 00:15:54.513 "zcopy": true, 00:15:54.513 "get_zone_info": false, 00:15:54.513 "zone_management": false, 00:15:54.513 "zone_append": false, 00:15:54.513 "compare": false, 00:15:54.513 "compare_and_write": false, 00:15:54.513 "abort": true, 00:15:54.513 "seek_hole": false, 00:15:54.513 "seek_data": false, 00:15:54.513 "copy": 
true, 00:15:54.513 "nvme_iov_md": false 00:15:54.513 }, 00:15:54.513 "memory_domains": [ 00:15:54.513 { 00:15:54.513 "dma_device_id": "system", 00:15:54.513 "dma_device_type": 1 00:15:54.513 }, 00:15:54.513 { 00:15:54.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:54.513 "dma_device_type": 2 00:15:54.513 } 00:15:54.513 ], 00:15:54.513 "driver_specific": {} 00:15:54.513 } 00:15:54.513 ] 00:15:54.513 15:52:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:54.513 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:54.513 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:54.513 15:52:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:54.772 BaseBdev3 00:15:54.772 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:54.772 15:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:54.772 15:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:54.772 15:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:54.772 15:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:54.772 15:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:54.772 15:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:55.032 15:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:55.032 [ 00:15:55.032 { 00:15:55.032 "name": "BaseBdev3", 00:15:55.032 "aliases": [ 00:15:55.032 "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7" 00:15:55.032 ], 00:15:55.032 "product_name": "Malloc disk", 00:15:55.032 "block_size": 512, 00:15:55.032 "num_blocks": 65536, 00:15:55.032 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:15:55.032 "assigned_rate_limits": { 00:15:55.032 "rw_ios_per_sec": 0, 00:15:55.032 "rw_mbytes_per_sec": 0, 00:15:55.032 "r_mbytes_per_sec": 0, 00:15:55.032 "w_mbytes_per_sec": 0 00:15:55.032 }, 00:15:55.032 "claimed": false, 00:15:55.032 "zoned": false, 00:15:55.032 "supported_io_types": { 00:15:55.032 "read": true, 00:15:55.032 "write": true, 00:15:55.032 "unmap": true, 00:15:55.032 "flush": true, 00:15:55.032 "reset": true, 00:15:55.032 "nvme_admin": false, 00:15:55.032 "nvme_io": false, 00:15:55.032 "nvme_io_md": false, 00:15:55.032 "write_zeroes": true, 00:15:55.032 "zcopy": true, 00:15:55.032 "get_zone_info": false, 00:15:55.032 "zone_management": false, 00:15:55.032 "zone_append": false, 00:15:55.032 "compare": false, 00:15:55.032 "compare_and_write": false, 00:15:55.032 "abort": true, 00:15:55.032 "seek_hole": false, 00:15:55.032 "seek_data": false, 00:15:55.032 "copy": true, 00:15:55.032 "nvme_iov_md": false 00:15:55.032 }, 00:15:55.032 "memory_domains": [ 00:15:55.032 { 00:15:55.032 "dma_device_id": "system", 00:15:55.032 "dma_device_type": 1 00:15:55.032 }, 00:15:55.032 { 00:15:55.032 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:55.032 "dma_device_type": 2 00:15:55.032 } 00:15:55.032 ], 00:15:55.032 "driver_specific": {} 00:15:55.032 } 00:15:55.032 ] 00:15:55.032 15:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:55.032 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:55.032 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:55.032 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:55.292 [2024-07-12 15:52:15.590573] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:55.292 [2024-07-12 15:52:15.590601] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:55.292 [2024-07-12 15:52:15.590615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:55.292 [2024-07-12 15:52:15.591651] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.292 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:55.552 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.552 "name": "Existed_Raid", 00:15:55.552 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:15:55.552 "strip_size_kb": 64, 00:15:55.552 "state": "configuring", 00:15:55.552 "raid_level": "concat", 00:15:55.552 "superblock": true, 00:15:55.552 "num_base_bdevs": 3, 00:15:55.552 "num_base_bdevs_discovered": 2, 00:15:55.552 "num_base_bdevs_operational": 3, 00:15:55.552 "base_bdevs_list": [ 00:15:55.552 { 00:15:55.552 "name": "BaseBdev1", 00:15:55.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:55.552 "is_configured": false, 00:15:55.552 "data_offset": 0, 00:15:55.552 "data_size": 0 00:15:55.552 }, 00:15:55.552 { 00:15:55.552 "name": 
"BaseBdev2", 00:15:55.552 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:15:55.552 "is_configured": true, 00:15:55.552 "data_offset": 2048, 00:15:55.552 "data_size": 63488 00:15:55.552 }, 00:15:55.552 { 00:15:55.552 "name": "BaseBdev3", 00:15:55.552 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:15:55.552 "is_configured": true, 00:15:55.552 "data_offset": 2048, 00:15:55.552 "data_size": 63488 00:15:55.552 } 00:15:55.552 ] 00:15:55.552 }' 00:15:55.552 15:52:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.552 15:52:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:56.121 [2024-07-12 15:52:16.524920] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:56.121 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.382 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:56.382 "name": "Existed_Raid", 00:15:56.382 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:15:56.382 "strip_size_kb": 64, 00:15:56.382 "state": "configuring", 00:15:56.382 "raid_level": "concat", 00:15:56.382 "superblock": true, 00:15:56.382 "num_base_bdevs": 3, 00:15:56.382 "num_base_bdevs_discovered": 1, 00:15:56.382 "num_base_bdevs_operational": 3, 00:15:56.382 "base_bdevs_list": [ 00:15:56.382 { 00:15:56.382 "name": "BaseBdev1", 00:15:56.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:56.382 "is_configured": false, 00:15:56.382 "data_offset": 0, 00:15:56.382 "data_size": 0 00:15:56.382 }, 00:15:56.382 { 00:15:56.382 "name": null, 00:15:56.382 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:15:56.382 "is_configured": false, 00:15:56.382 "data_offset": 2048, 00:15:56.382 "data_size": 63488 00:15:56.382 }, 00:15:56.382 { 00:15:56.382 "name": "BaseBdev3", 00:15:56.382 "uuid": 
"fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:15:56.382 "is_configured": true, 00:15:56.382 "data_offset": 2048, 00:15:56.382 "data_size": 63488 00:15:56.382 } 00:15:56.382 ] 00:15:56.382 }' 00:15:56.382 15:52:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:56.382 15:52:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:57.326 15:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:57.326 15:52:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:57.895 15:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:57.895 15:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:58.155 [2024-07-12 15:52:18.406718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:58.155 BaseBdev1 00:15:58.155 15:52:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:58.155 15:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:58.155 15:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:58.155 15:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:58.155 15:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:58.155 15:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:58.155 15:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:58.722 15:52:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:58.722 [ 00:15:58.722 { 00:15:58.722 "name": "BaseBdev1", 00:15:58.722 "aliases": [ 00:15:58.722 "6d2eb587-49e7-4e46-bcd9-023e84342b22" 00:15:58.722 ], 00:15:58.722 "product_name": "Malloc disk", 00:15:58.722 "block_size": 512, 00:15:58.722 "num_blocks": 65536, 00:15:58.722 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:15:58.722 "assigned_rate_limits": { 00:15:58.722 "rw_ios_per_sec": 0, 00:15:58.722 "rw_mbytes_per_sec": 0, 00:15:58.722 "r_mbytes_per_sec": 0, 00:15:58.722 "w_mbytes_per_sec": 0 00:15:58.722 }, 00:15:58.722 "claimed": true, 00:15:58.722 "claim_type": "exclusive_write", 00:15:58.722 "zoned": false, 00:15:58.722 "supported_io_types": { 00:15:58.722 "read": true, 00:15:58.722 "write": true, 00:15:58.722 "unmap": true, 00:15:58.722 "flush": true, 00:15:58.722 "reset": true, 00:15:58.722 "nvme_admin": false, 00:15:58.722 "nvme_io": false, 00:15:58.722 "nvme_io_md": false, 00:15:58.722 "write_zeroes": true, 00:15:58.722 "zcopy": true, 00:15:58.722 "get_zone_info": false, 00:15:58.722 "zone_management": false, 00:15:58.722 "zone_append": false, 00:15:58.722 "compare": false, 00:15:58.722 "compare_and_write": false, 00:15:58.722 "abort": true, 00:15:58.722 "seek_hole": false, 
00:15:58.722 "seek_data": false, 00:15:58.722 "copy": true, 00:15:58.722 "nvme_iov_md": false 00:15:58.722 }, 00:15:58.722 "memory_domains": [ 00:15:58.722 { 00:15:58.722 "dma_device_id": "system", 00:15:58.722 "dma_device_type": 1 00:15:58.722 }, 00:15:58.722 { 00:15:58.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.722 "dma_device_type": 2 00:15:58.722 } 00:15:58.722 ], 00:15:58.722 "driver_specific": {} 00:15:58.722 } 00:15:58.722 ] 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.983 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.554 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.554 "name": "Existed_Raid", 00:15:59.554 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:15:59.554 "strip_size_kb": 64, 00:15:59.554 "state": "configuring", 00:15:59.554 "raid_level": "concat", 00:15:59.554 "superblock": true, 00:15:59.554 "num_base_bdevs": 3, 00:15:59.554 "num_base_bdevs_discovered": 2, 00:15:59.554 "num_base_bdevs_operational": 3, 00:15:59.554 "base_bdevs_list": [ 00:15:59.554 { 00:15:59.554 "name": "BaseBdev1", 00:15:59.554 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:15:59.554 "is_configured": true, 00:15:59.554 "data_offset": 2048, 00:15:59.554 "data_size": 63488 00:15:59.554 }, 00:15:59.554 { 00:15:59.554 "name": null, 00:15:59.554 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:15:59.554 "is_configured": false, 00:15:59.554 "data_offset": 2048, 00:15:59.554 "data_size": 63488 00:15:59.554 }, 00:15:59.554 { 00:15:59.554 "name": "BaseBdev3", 00:15:59.554 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:15:59.554 "is_configured": true, 00:15:59.554 "data_offset": 2048, 00:15:59.554 "data_size": 63488 00:15:59.554 } 00:15:59.554 ] 00:15:59.554 }' 00:15:59.554 15:52:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.554 15:52:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:00.494 15:52:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.494 15:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:00.494 15:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:00.494 15:52:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:00.754 [2024-07-12 15:52:21.105856] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.754 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.015 15:52:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.015 "name": "Existed_Raid", 00:16:01.015 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:16:01.015 "strip_size_kb": 64, 00:16:01.015 "state": "configuring", 00:16:01.015 "raid_level": "concat", 00:16:01.015 "superblock": true, 00:16:01.015 "num_base_bdevs": 3, 00:16:01.015 "num_base_bdevs_discovered": 1, 00:16:01.015 "num_base_bdevs_operational": 3, 00:16:01.015 "base_bdevs_list": [ 00:16:01.015 { 00:16:01.015 "name": "BaseBdev1", 00:16:01.015 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:16:01.015 "is_configured": true, 00:16:01.015 "data_offset": 2048, 00:16:01.015 "data_size": 63488 00:16:01.015 }, 00:16:01.015 { 00:16:01.015 "name": null, 00:16:01.015 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:16:01.015 "is_configured": false, 00:16:01.015 "data_offset": 2048, 00:16:01.015 "data_size": 63488 00:16:01.015 }, 00:16:01.015 { 00:16:01.015 "name": null, 00:16:01.015 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:16:01.015 "is_configured": false, 00:16:01.015 "data_offset": 2048, 00:16:01.015 "data_size": 63488 00:16:01.015 } 00:16:01.015 ] 00:16:01.015 }' 00:16:01.015 15:52:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.015 15:52:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:01.954 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.954 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:02.213 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:02.213 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:02.473 [2024-07-12 15:52:22.725966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:02.473 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:02.473 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.473 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.473 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:02.474 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:02.474 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.474 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.474 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.474 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.474 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.474 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.474 15:52:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:03.046 15:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.046 "name": "Existed_Raid", 00:16:03.046 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:16:03.046 "strip_size_kb": 64, 00:16:03.046 "state": "configuring", 00:16:03.046 "raid_level": "concat", 00:16:03.046 "superblock": true, 00:16:03.046 "num_base_bdevs": 3, 00:16:03.046 "num_base_bdevs_discovered": 2, 00:16:03.046 "num_base_bdevs_operational": 3, 00:16:03.046 "base_bdevs_list": [ 00:16:03.046 { 00:16:03.046 "name": "BaseBdev1", 00:16:03.046 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:16:03.046 "is_configured": true, 00:16:03.046 "data_offset": 2048, 00:16:03.046 "data_size": 63488 00:16:03.046 }, 00:16:03.046 { 00:16:03.046 "name": null, 00:16:03.046 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:16:03.046 "is_configured": false, 00:16:03.046 "data_offset": 2048, 00:16:03.046 "data_size": 63488 00:16:03.046 }, 00:16:03.046 { 00:16:03.046 "name": "BaseBdev3", 00:16:03.046 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 
00:16:03.046 "is_configured": true, 00:16:03.046 "data_offset": 2048, 00:16:03.046 "data_size": 63488 00:16:03.046 } 00:16:03.046 ] 00:16:03.046 }' 00:16:03.046 15:52:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.046 15:52:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:03.985 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.985 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:04.245 [2024-07-12 15:52:24.646847] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.245 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.505 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.505 "name": "Existed_Raid", 00:16:04.505 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:16:04.505 "strip_size_kb": 64, 00:16:04.505 "state": "configuring", 00:16:04.505 "raid_level": "concat", 00:16:04.505 "superblock": true, 00:16:04.505 "num_base_bdevs": 3, 00:16:04.505 "num_base_bdevs_discovered": 1, 00:16:04.505 "num_base_bdevs_operational": 3, 00:16:04.505 "base_bdevs_list": [ 00:16:04.505 { 00:16:04.505 "name": null, 00:16:04.505 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:16:04.505 "is_configured": false, 00:16:04.505 "data_offset": 2048, 00:16:04.505 "data_size": 63488 00:16:04.505 }, 00:16:04.505 { 00:16:04.505 "name": null, 00:16:04.505 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:16:04.505 "is_configured": false, 00:16:04.505 "data_offset": 2048, 
00:16:04.505 "data_size": 63488 00:16:04.505 }, 00:16:04.505 { 00:16:04.505 "name": "BaseBdev3", 00:16:04.505 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:16:04.505 "is_configured": true, 00:16:04.505 "data_offset": 2048, 00:16:04.505 "data_size": 63488 00:16:04.505 } 00:16:04.505 ] 00:16:04.505 }' 00:16:04.505 15:52:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.505 15:52:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:05.442 15:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.442 15:52:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:05.700 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:05.700 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:05.959 [2024-07-12 15:52:26.196612] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:05.959 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.219 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.219 "name": "Existed_Raid", 00:16:06.219 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:16:06.219 "strip_size_kb": 64, 00:16:06.219 "state": "configuring", 00:16:06.219 "raid_level": "concat", 00:16:06.219 "superblock": true, 00:16:06.219 "num_base_bdevs": 3, 00:16:06.219 "num_base_bdevs_discovered": 2, 00:16:06.219 "num_base_bdevs_operational": 3, 00:16:06.219 "base_bdevs_list": [ 00:16:06.219 { 00:16:06.219 "name": null, 00:16:06.219 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:16:06.219 "is_configured": false, 00:16:06.219 "data_offset": 2048, 00:16:06.219 
"data_size": 63488 00:16:06.219 }, 00:16:06.219 { 00:16:06.219 "name": "BaseBdev2", 00:16:06.219 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:16:06.219 "is_configured": true, 00:16:06.219 "data_offset": 2048, 00:16:06.219 "data_size": 63488 00:16:06.219 }, 00:16:06.219 { 00:16:06.219 "name": "BaseBdev3", 00:16:06.219 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:16:06.219 "is_configured": true, 00:16:06.219 "data_offset": 2048, 00:16:06.219 "data_size": 63488 00:16:06.219 } 00:16:06.219 ] 00:16:06.219 }' 00:16:06.219 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.219 15:52:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:06.788 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.788 15:52:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:06.788 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:06.788 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.788 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:07.047 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6d2eb587-49e7-4e46-bcd9-023e84342b22 00:16:07.306 [2024-07-12 15:52:27.512938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:07.306 [2024-07-12 15:52:27.513048] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c53a10 00:16:07.306 [2024-07-12 15:52:27.513055] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:07.306 [2024-07-12 15:52:27.513189] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c54230 00:16:07.306 [2024-07-12 15:52:27.513279] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c53a10 00:16:07.306 [2024-07-12 15:52:27.513288] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c53a10 00:16:07.306 [2024-07-12 15:52:27.513358] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:07.306 NewBaseBdev 00:16:07.306 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:07.306 15:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:07.306 15:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:07.306 15:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:07.306 15:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:07.306 15:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:07.306 15:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:07.306 15:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:07.565 [ 00:16:07.565 { 00:16:07.565 "name": "NewBaseBdev", 00:16:07.565 "aliases": [ 00:16:07.565 "6d2eb587-49e7-4e46-bcd9-023e84342b22" 00:16:07.565 ], 00:16:07.565 "product_name": "Malloc disk", 00:16:07.565 "block_size": 512, 00:16:07.565 "num_blocks": 65536, 00:16:07.565 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:16:07.565 "assigned_rate_limits": { 00:16:07.566 "rw_ios_per_sec": 0, 00:16:07.566 "rw_mbytes_per_sec": 0, 00:16:07.566 "r_mbytes_per_sec": 0, 00:16:07.566 "w_mbytes_per_sec": 0 00:16:07.566 }, 00:16:07.566 "claimed": true, 00:16:07.566 "claim_type": "exclusive_write", 00:16:07.566 "zoned": false, 00:16:07.566 "supported_io_types": { 00:16:07.566 "read": true, 00:16:07.566 "write": true, 00:16:07.566 "unmap": true, 00:16:07.566 "flush": true, 00:16:07.566 "reset": true, 00:16:07.566 "nvme_admin": false, 00:16:07.566 "nvme_io": false, 00:16:07.566 "nvme_io_md": false, 00:16:07.566 "write_zeroes": true, 00:16:07.566 "zcopy": true, 00:16:07.566 "get_zone_info": false, 00:16:07.566 "zone_management": false, 00:16:07.566 "zone_append": false, 00:16:07.566 "compare": false, 00:16:07.566 "compare_and_write": false, 00:16:07.566 "abort": true, 00:16:07.566 "seek_hole": false, 00:16:07.566 "seek_data": false, 00:16:07.566 "copy": true, 00:16:07.566 "nvme_iov_md": false 00:16:07.566 }, 00:16:07.566 "memory_domains": [ 00:16:07.566 { 00:16:07.566 "dma_device_id": "system", 00:16:07.566 "dma_device_type": 1 00:16:07.566 }, 00:16:07.566 { 00:16:07.566 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:07.566 "dma_device_type": 2 00:16:07.566 } 00:16:07.566 ], 00:16:07.566 "driver_specific": {} 00:16:07.566 } 00:16:07.566 ] 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:07.566 15:52:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:16:07.848 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:07.848 "name": "Existed_Raid", 00:16:07.848 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:16:07.848 "strip_size_kb": 64, 00:16:07.848 "state": "online", 00:16:07.848 "raid_level": "concat", 00:16:07.848 "superblock": true, 00:16:07.848 "num_base_bdevs": 3, 00:16:07.848 "num_base_bdevs_discovered": 3, 00:16:07.848 "num_base_bdevs_operational": 3, 00:16:07.848 "base_bdevs_list": [ 00:16:07.848 { 00:16:07.848 "name": "NewBaseBdev", 00:16:07.848 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:16:07.848 "is_configured": true, 00:16:07.848 "data_offset": 2048, 00:16:07.848 "data_size": 63488 00:16:07.848 }, 00:16:07.848 { 00:16:07.848 "name": "BaseBdev2", 00:16:07.848 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:16:07.848 "is_configured": true, 00:16:07.848 "data_offset": 2048, 00:16:07.848 "data_size": 63488 00:16:07.848 }, 00:16:07.848 { 00:16:07.848 "name": "BaseBdev3", 00:16:07.848 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:16:07.848 "is_configured": true, 00:16:07.848 "data_offset": 2048, 00:16:07.848 "data_size": 63488 00:16:07.848 } 00:16:07.848 ] 00:16:07.848 }' 00:16:07.848 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:07.848 15:52:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:08.436 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:08.436 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:08.436 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:08.436 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:08.436 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:08.436 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:08.436 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:08.436 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:08.436 [2024-07-12 15:52:28.836512] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:08.436 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:08.436 "name": "Existed_Raid", 00:16:08.437 "aliases": [ 00:16:08.437 "8bed6dff-f69a-450a-a7cb-3bfe2058b03d" 00:16:08.437 ], 00:16:08.437 "product_name": "Raid Volume", 00:16:08.437 "block_size": 512, 00:16:08.437 "num_blocks": 190464, 00:16:08.437 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:16:08.437 "assigned_rate_limits": { 00:16:08.437 "rw_ios_per_sec": 0, 00:16:08.437 "rw_mbytes_per_sec": 0, 00:16:08.437 "r_mbytes_per_sec": 0, 00:16:08.437 "w_mbytes_per_sec": 0 00:16:08.437 }, 00:16:08.437 "claimed": false, 00:16:08.437 "zoned": false, 00:16:08.437 "supported_io_types": { 00:16:08.437 "read": true, 00:16:08.437 "write": true, 00:16:08.437 "unmap": true, 00:16:08.437 "flush": true, 00:16:08.437 "reset": true, 00:16:08.437 "nvme_admin": false, 00:16:08.437 "nvme_io": false, 00:16:08.437 "nvme_io_md": false, 00:16:08.437 "write_zeroes": true, 
00:16:08.437 "zcopy": false, 00:16:08.437 "get_zone_info": false, 00:16:08.437 "zone_management": false, 00:16:08.437 "zone_append": false, 00:16:08.437 "compare": false, 00:16:08.437 "compare_and_write": false, 00:16:08.437 "abort": false, 00:16:08.437 "seek_hole": false, 00:16:08.437 "seek_data": false, 00:16:08.437 "copy": false, 00:16:08.437 "nvme_iov_md": false 00:16:08.437 }, 00:16:08.437 "memory_domains": [ 00:16:08.437 { 00:16:08.437 "dma_device_id": "system", 00:16:08.437 "dma_device_type": 1 00:16:08.437 }, 00:16:08.437 { 00:16:08.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.437 "dma_device_type": 2 00:16:08.437 }, 00:16:08.437 { 00:16:08.437 "dma_device_id": "system", 00:16:08.437 "dma_device_type": 1 00:16:08.437 }, 00:16:08.437 { 00:16:08.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.437 "dma_device_type": 2 00:16:08.437 }, 00:16:08.437 { 00:16:08.437 "dma_device_id": "system", 00:16:08.437 "dma_device_type": 1 00:16:08.437 }, 00:16:08.437 { 00:16:08.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.437 "dma_device_type": 2 00:16:08.437 } 00:16:08.437 ], 00:16:08.437 "driver_specific": { 00:16:08.437 "raid": { 00:16:08.437 "uuid": "8bed6dff-f69a-450a-a7cb-3bfe2058b03d", 00:16:08.437 "strip_size_kb": 64, 00:16:08.437 "state": "online", 00:16:08.437 "raid_level": "concat", 00:16:08.437 "superblock": true, 00:16:08.437 "num_base_bdevs": 3, 00:16:08.437 "num_base_bdevs_discovered": 3, 00:16:08.437 "num_base_bdevs_operational": 3, 00:16:08.437 "base_bdevs_list": [ 00:16:08.437 { 00:16:08.437 "name": "NewBaseBdev", 00:16:08.437 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:16:08.437 "is_configured": true, 00:16:08.437 "data_offset": 2048, 00:16:08.437 "data_size": 63488 00:16:08.437 }, 00:16:08.437 { 00:16:08.437 "name": "BaseBdev2", 00:16:08.437 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:16:08.437 "is_configured": true, 00:16:08.437 "data_offset": 2048, 00:16:08.437 "data_size": 63488 00:16:08.437 }, 00:16:08.437 { 00:16:08.437 "name": "BaseBdev3", 00:16:08.437 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:16:08.437 "is_configured": true, 00:16:08.437 "data_offset": 2048, 00:16:08.437 "data_size": 63488 00:16:08.437 } 00:16:08.437 ] 00:16:08.437 } 00:16:08.437 } 00:16:08.437 }' 00:16:08.437 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:08.437 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:08.437 BaseBdev2 00:16:08.437 BaseBdev3' 00:16:08.437 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:08.437 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:08.437 15:52:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:08.696 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:08.696 "name": "NewBaseBdev", 00:16:08.696 "aliases": [ 00:16:08.696 "6d2eb587-49e7-4e46-bcd9-023e84342b22" 00:16:08.696 ], 00:16:08.696 "product_name": "Malloc disk", 00:16:08.696 "block_size": 512, 00:16:08.696 "num_blocks": 65536, 00:16:08.696 "uuid": "6d2eb587-49e7-4e46-bcd9-023e84342b22", 00:16:08.696 "assigned_rate_limits": { 00:16:08.696 "rw_ios_per_sec": 0, 00:16:08.696 "rw_mbytes_per_sec": 
0, 00:16:08.696 "r_mbytes_per_sec": 0, 00:16:08.696 "w_mbytes_per_sec": 0 00:16:08.696 }, 00:16:08.696 "claimed": true, 00:16:08.696 "claim_type": "exclusive_write", 00:16:08.696 "zoned": false, 00:16:08.696 "supported_io_types": { 00:16:08.696 "read": true, 00:16:08.696 "write": true, 00:16:08.696 "unmap": true, 00:16:08.697 "flush": true, 00:16:08.697 "reset": true, 00:16:08.697 "nvme_admin": false, 00:16:08.697 "nvme_io": false, 00:16:08.697 "nvme_io_md": false, 00:16:08.697 "write_zeroes": true, 00:16:08.697 "zcopy": true, 00:16:08.697 "get_zone_info": false, 00:16:08.697 "zone_management": false, 00:16:08.697 "zone_append": false, 00:16:08.697 "compare": false, 00:16:08.697 "compare_and_write": false, 00:16:08.697 "abort": true, 00:16:08.697 "seek_hole": false, 00:16:08.697 "seek_data": false, 00:16:08.697 "copy": true, 00:16:08.697 "nvme_iov_md": false 00:16:08.697 }, 00:16:08.697 "memory_domains": [ 00:16:08.697 { 00:16:08.697 "dma_device_id": "system", 00:16:08.697 "dma_device_type": 1 00:16:08.697 }, 00:16:08.697 { 00:16:08.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:08.697 "dma_device_type": 2 00:16:08.697 } 00:16:08.697 ], 00:16:08.697 "driver_specific": {} 00:16:08.697 }' 00:16:08.697 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.697 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:08.956 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:08.956 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.956 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:08.956 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:08.956 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.956 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:08.956 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:08.956 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:08.956 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:09.215 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:09.215 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:09.215 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:09.215 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:09.215 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:09.215 "name": "BaseBdev2", 00:16:09.215 "aliases": [ 00:16:09.215 "acc206ae-4161-402d-894d-0b06f261c0cc" 00:16:09.215 ], 00:16:09.215 "product_name": "Malloc disk", 00:16:09.215 "block_size": 512, 00:16:09.215 "num_blocks": 65536, 00:16:09.215 "uuid": "acc206ae-4161-402d-894d-0b06f261c0cc", 00:16:09.215 "assigned_rate_limits": { 00:16:09.215 "rw_ios_per_sec": 0, 00:16:09.215 "rw_mbytes_per_sec": 0, 00:16:09.215 "r_mbytes_per_sec": 0, 00:16:09.215 "w_mbytes_per_sec": 0 00:16:09.215 }, 00:16:09.215 "claimed": true, 00:16:09.215 
"claim_type": "exclusive_write", 00:16:09.215 "zoned": false, 00:16:09.215 "supported_io_types": { 00:16:09.215 "read": true, 00:16:09.215 "write": true, 00:16:09.215 "unmap": true, 00:16:09.216 "flush": true, 00:16:09.216 "reset": true, 00:16:09.216 "nvme_admin": false, 00:16:09.216 "nvme_io": false, 00:16:09.216 "nvme_io_md": false, 00:16:09.216 "write_zeroes": true, 00:16:09.216 "zcopy": true, 00:16:09.216 "get_zone_info": false, 00:16:09.216 "zone_management": false, 00:16:09.216 "zone_append": false, 00:16:09.216 "compare": false, 00:16:09.216 "compare_and_write": false, 00:16:09.216 "abort": true, 00:16:09.216 "seek_hole": false, 00:16:09.216 "seek_data": false, 00:16:09.216 "copy": true, 00:16:09.216 "nvme_iov_md": false 00:16:09.216 }, 00:16:09.216 "memory_domains": [ 00:16:09.216 { 00:16:09.216 "dma_device_id": "system", 00:16:09.216 "dma_device_type": 1 00:16:09.216 }, 00:16:09.216 { 00:16:09.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.216 "dma_device_type": 2 00:16:09.216 } 00:16:09.216 ], 00:16:09.216 "driver_specific": {} 00:16:09.216 }' 00:16:09.216 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.475 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.475 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:09.475 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.475 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.475 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:09.475 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:09.475 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:09.475 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:09.475 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:09.735 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:09.735 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:09.735 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:09.735 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:09.735 15:52:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:09.995 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:09.995 "name": "BaseBdev3", 00:16:09.995 "aliases": [ 00:16:09.995 "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7" 00:16:09.995 ], 00:16:09.995 "product_name": "Malloc disk", 00:16:09.995 "block_size": 512, 00:16:09.995 "num_blocks": 65536, 00:16:09.995 "uuid": "fa6aa05c-a235-41ef-b7e4-46af27b2b9c7", 00:16:09.995 "assigned_rate_limits": { 00:16:09.995 "rw_ios_per_sec": 0, 00:16:09.995 "rw_mbytes_per_sec": 0, 00:16:09.995 "r_mbytes_per_sec": 0, 00:16:09.995 "w_mbytes_per_sec": 0 00:16:09.995 }, 00:16:09.995 "claimed": true, 00:16:09.995 "claim_type": "exclusive_write", 00:16:09.995 "zoned": false, 00:16:09.995 "supported_io_types": { 00:16:09.995 "read": true, 
00:16:09.995 "write": true, 00:16:09.995 "unmap": true, 00:16:09.995 "flush": true, 00:16:09.995 "reset": true, 00:16:09.995 "nvme_admin": false, 00:16:09.995 "nvme_io": false, 00:16:09.995 "nvme_io_md": false, 00:16:09.995 "write_zeroes": true, 00:16:09.995 "zcopy": true, 00:16:09.995 "get_zone_info": false, 00:16:09.995 "zone_management": false, 00:16:09.995 "zone_append": false, 00:16:09.995 "compare": false, 00:16:09.995 "compare_and_write": false, 00:16:09.995 "abort": true, 00:16:09.995 "seek_hole": false, 00:16:09.995 "seek_data": false, 00:16:09.995 "copy": true, 00:16:09.995 "nvme_iov_md": false 00:16:09.995 }, 00:16:09.995 "memory_domains": [ 00:16:09.995 { 00:16:09.995 "dma_device_id": "system", 00:16:09.995 "dma_device_type": 1 00:16:09.995 }, 00:16:09.995 { 00:16:09.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:09.995 "dma_device_type": 2 00:16:09.995 } 00:16:09.995 ], 00:16:09.995 "driver_specific": {} 00:16:09.995 }' 00:16:09.995 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.995 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:09.995 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:09.995 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.995 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:09.995 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:09.995 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:09.995 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:10.255 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:10.255 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.255 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:10.255 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:10.255 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:10.515 [2024-07-12 15:52:30.713045] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:10.515 [2024-07-12 15:52:30.713062] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:10.515 [2024-07-12 15:52:30.713098] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:10.515 [2024-07-12 15:52:30.713136] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:10.515 [2024-07-12 15:52:30.713142] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c53a10 name Existed_Raid, state offline 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2542223 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2542223 ']' 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2542223 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 
00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2542223 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2542223' 00:16:10.515 killing process with pid 2542223 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2542223 00:16:10.515 [2024-07-12 15:52:30.779084] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2542223 00:16:10.515 [2024-07-12 15:52:30.793809] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:10.515 00:16:10.515 real 0m27.669s 00:16:10.515 user 0m52.209s 00:16:10.515 sys 0m3.730s 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:10.515 15:52:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:10.515 ************************************ 00:16:10.515 END TEST raid_state_function_test_sb 00:16:10.515 ************************************ 00:16:10.515 15:52:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:10.515 15:52:30 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:16:10.515 15:52:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:16:10.515 15:52:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:10.515 15:52:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:10.774 ************************************ 00:16:10.775 START TEST raid_superblock_test 00:16:10.775 ************************************ 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local 
strip_size_create_arg 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2547419 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2547419 /var/tmp/spdk-raid.sock 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2547419 ']' 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:10.775 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:10.775 15:52:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:10.775 [2024-07-12 15:52:31.047484] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:16:10.775 [2024-07-12 15:52:31.047538] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2547419 ] 00:16:10.775 [2024-07-12 15:52:31.138187] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:10.775 [2024-07-12 15:52:31.213045] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:11.034 [2024-07-12 15:52:31.252820] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:11.034 [2024-07-12 15:52:31.252844] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:11.293 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:11.293 malloc1 00:16:11.553 15:52:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:11.812 [2024-07-12 15:52:32.247910] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:11.812 [2024-07-12 15:52:32.247944] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:11.812 [2024-07-12 15:52:32.247956] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d82b50 00:16:11.812 [2024-07-12 15:52:32.247962] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:11.812 [2024-07-12 15:52:32.249265] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:11.812 [2024-07-12 15:52:32.249284] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:11.812 pt1 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:12.072 malloc2 00:16:12.072 15:52:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:12.641 [2024-07-12 15:52:32.983759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:12.641 [2024-07-12 15:52:32.983788] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:12.641 [2024-07-12 15:52:32.983798] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d83df0 00:16:12.641 [2024-07-12 15:52:32.983804] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:12.641 [2024-07-12 15:52:32.984995] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:12.641 [2024-07-12 15:52:32.985014] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:12.641 pt2 00:16:12.641 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:12.641 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:12.641 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:16:12.641 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:16:12.641 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:12.641 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:12.641 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:16:12.641 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:12.641 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:12.900 malloc3 00:16:12.900 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:13.468 [2024-07-12 15:52:33.719507] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:13.468 [2024-07-12 15:52:33.719536] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:13.468 [2024-07-12 15:52:33.719545] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d83770 00:16:13.468 [2024-07-12 15:52:33.719551] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:13.468 [2024-07-12 15:52:33.720743] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:13.468 [2024-07-12 15:52:33.720762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:13.468 pt3 00:16:13.468 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:16:13.468 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:16:13.468 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:13.468 [2024-07-12 15:52:33.895970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:13.468 [2024-07-12 15:52:33.896966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:13.468 [2024-07-12 15:52:33.897008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:13.468 [2024-07-12 15:52:33.897124] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f28cb0 00:16:13.468 [2024-07-12 15:52:33.897131] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:13.468 [2024-07-12 15:52:33.897277] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d83600 00:16:13.468 [2024-07-12 15:52:33.897385] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f28cb0 00:16:13.468 [2024-07-12 15:52:33.897391] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f28cb0 00:16:13.468 [2024-07-12 15:52:33.897457] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.727 15:52:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:13.727 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.727 "name": "raid_bdev1", 00:16:13.727 "uuid": "219e7755-d64c-41c0-a6c3-5227877df989", 00:16:13.727 "strip_size_kb": 64, 00:16:13.727 "state": "online", 00:16:13.727 "raid_level": "concat", 00:16:13.727 "superblock": true, 00:16:13.727 "num_base_bdevs": 3, 
00:16:13.727 "num_base_bdevs_discovered": 3, 00:16:13.727 "num_base_bdevs_operational": 3, 00:16:13.727 "base_bdevs_list": [ 00:16:13.727 { 00:16:13.727 "name": "pt1", 00:16:13.727 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:13.727 "is_configured": true, 00:16:13.727 "data_offset": 2048, 00:16:13.727 "data_size": 63488 00:16:13.727 }, 00:16:13.727 { 00:16:13.727 "name": "pt2", 00:16:13.727 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:13.727 "is_configured": true, 00:16:13.727 "data_offset": 2048, 00:16:13.727 "data_size": 63488 00:16:13.727 }, 00:16:13.727 { 00:16:13.727 "name": "pt3", 00:16:13.727 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:13.727 "is_configured": true, 00:16:13.727 "data_offset": 2048, 00:16:13.727 "data_size": 63488 00:16:13.727 } 00:16:13.727 ] 00:16:13.727 }' 00:16:13.727 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.727 15:52:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.298 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:16:14.298 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:14.298 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:14.298 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:14.298 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:14.298 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:14.298 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:14.298 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:14.557 [2024-07-12 15:52:34.802456] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:14.557 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:14.557 "name": "raid_bdev1", 00:16:14.557 "aliases": [ 00:16:14.557 "219e7755-d64c-41c0-a6c3-5227877df989" 00:16:14.557 ], 00:16:14.557 "product_name": "Raid Volume", 00:16:14.557 "block_size": 512, 00:16:14.557 "num_blocks": 190464, 00:16:14.557 "uuid": "219e7755-d64c-41c0-a6c3-5227877df989", 00:16:14.557 "assigned_rate_limits": { 00:16:14.557 "rw_ios_per_sec": 0, 00:16:14.557 "rw_mbytes_per_sec": 0, 00:16:14.557 "r_mbytes_per_sec": 0, 00:16:14.557 "w_mbytes_per_sec": 0 00:16:14.557 }, 00:16:14.557 "claimed": false, 00:16:14.557 "zoned": false, 00:16:14.557 "supported_io_types": { 00:16:14.557 "read": true, 00:16:14.557 "write": true, 00:16:14.557 "unmap": true, 00:16:14.557 "flush": true, 00:16:14.557 "reset": true, 00:16:14.557 "nvme_admin": false, 00:16:14.557 "nvme_io": false, 00:16:14.557 "nvme_io_md": false, 00:16:14.557 "write_zeroes": true, 00:16:14.557 "zcopy": false, 00:16:14.557 "get_zone_info": false, 00:16:14.557 "zone_management": false, 00:16:14.557 "zone_append": false, 00:16:14.557 "compare": false, 00:16:14.557 "compare_and_write": false, 00:16:14.557 "abort": false, 00:16:14.557 "seek_hole": false, 00:16:14.557 "seek_data": false, 00:16:14.557 "copy": false, 00:16:14.557 "nvme_iov_md": false 00:16:14.557 }, 00:16:14.557 "memory_domains": [ 00:16:14.557 { 00:16:14.557 "dma_device_id": "system", 00:16:14.557 "dma_device_type": 1 
00:16:14.557 }, 00:16:14.557 { 00:16:14.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.557 "dma_device_type": 2 00:16:14.557 }, 00:16:14.557 { 00:16:14.557 "dma_device_id": "system", 00:16:14.557 "dma_device_type": 1 00:16:14.557 }, 00:16:14.557 { 00:16:14.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.557 "dma_device_type": 2 00:16:14.557 }, 00:16:14.557 { 00:16:14.557 "dma_device_id": "system", 00:16:14.557 "dma_device_type": 1 00:16:14.557 }, 00:16:14.557 { 00:16:14.557 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.557 "dma_device_type": 2 00:16:14.557 } 00:16:14.557 ], 00:16:14.557 "driver_specific": { 00:16:14.557 "raid": { 00:16:14.557 "uuid": "219e7755-d64c-41c0-a6c3-5227877df989", 00:16:14.557 "strip_size_kb": 64, 00:16:14.557 "state": "online", 00:16:14.557 "raid_level": "concat", 00:16:14.557 "superblock": true, 00:16:14.557 "num_base_bdevs": 3, 00:16:14.557 "num_base_bdevs_discovered": 3, 00:16:14.557 "num_base_bdevs_operational": 3, 00:16:14.557 "base_bdevs_list": [ 00:16:14.557 { 00:16:14.557 "name": "pt1", 00:16:14.557 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:14.557 "is_configured": true, 00:16:14.557 "data_offset": 2048, 00:16:14.557 "data_size": 63488 00:16:14.557 }, 00:16:14.557 { 00:16:14.557 "name": "pt2", 00:16:14.557 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:14.557 "is_configured": true, 00:16:14.557 "data_offset": 2048, 00:16:14.557 "data_size": 63488 00:16:14.557 }, 00:16:14.557 { 00:16:14.557 "name": "pt3", 00:16:14.557 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:14.557 "is_configured": true, 00:16:14.557 "data_offset": 2048, 00:16:14.557 "data_size": 63488 00:16:14.557 } 00:16:14.557 ] 00:16:14.557 } 00:16:14.557 } 00:16:14.557 }' 00:16:14.557 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:14.557 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:14.557 pt2 00:16:14.557 pt3' 00:16:14.557 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:14.557 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:14.557 15:52:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:14.817 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:14.817 "name": "pt1", 00:16:14.817 "aliases": [ 00:16:14.817 "00000000-0000-0000-0000-000000000001" 00:16:14.817 ], 00:16:14.817 "product_name": "passthru", 00:16:14.817 "block_size": 512, 00:16:14.817 "num_blocks": 65536, 00:16:14.817 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:14.817 "assigned_rate_limits": { 00:16:14.817 "rw_ios_per_sec": 0, 00:16:14.817 "rw_mbytes_per_sec": 0, 00:16:14.817 "r_mbytes_per_sec": 0, 00:16:14.817 "w_mbytes_per_sec": 0 00:16:14.817 }, 00:16:14.817 "claimed": true, 00:16:14.817 "claim_type": "exclusive_write", 00:16:14.817 "zoned": false, 00:16:14.817 "supported_io_types": { 00:16:14.817 "read": true, 00:16:14.817 "write": true, 00:16:14.817 "unmap": true, 00:16:14.817 "flush": true, 00:16:14.817 "reset": true, 00:16:14.817 "nvme_admin": false, 00:16:14.817 "nvme_io": false, 00:16:14.817 "nvme_io_md": false, 00:16:14.817 "write_zeroes": true, 00:16:14.817 "zcopy": true, 00:16:14.817 "get_zone_info": false, 00:16:14.817 "zone_management": false, 
00:16:14.817 "zone_append": false, 00:16:14.817 "compare": false, 00:16:14.817 "compare_and_write": false, 00:16:14.817 "abort": true, 00:16:14.817 "seek_hole": false, 00:16:14.817 "seek_data": false, 00:16:14.817 "copy": true, 00:16:14.817 "nvme_iov_md": false 00:16:14.817 }, 00:16:14.817 "memory_domains": [ 00:16:14.817 { 00:16:14.817 "dma_device_id": "system", 00:16:14.817 "dma_device_type": 1 00:16:14.817 }, 00:16:14.817 { 00:16:14.817 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:14.817 "dma_device_type": 2 00:16:14.817 } 00:16:14.817 ], 00:16:14.817 "driver_specific": { 00:16:14.817 "passthru": { 00:16:14.817 "name": "pt1", 00:16:14.817 "base_bdev_name": "malloc1" 00:16:14.817 } 00:16:14.817 } 00:16:14.817 }' 00:16:14.817 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.817 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:14.817 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:14.817 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.817 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:14.817 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:14.817 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:14.817 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.077 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:15.077 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.077 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.077 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:15.077 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:15.077 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:15.077 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:15.337 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:15.337 "name": "pt2", 00:16:15.337 "aliases": [ 00:16:15.337 "00000000-0000-0000-0000-000000000002" 00:16:15.337 ], 00:16:15.337 "product_name": "passthru", 00:16:15.337 "block_size": 512, 00:16:15.337 "num_blocks": 65536, 00:16:15.337 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:15.337 "assigned_rate_limits": { 00:16:15.337 "rw_ios_per_sec": 0, 00:16:15.337 "rw_mbytes_per_sec": 0, 00:16:15.337 "r_mbytes_per_sec": 0, 00:16:15.337 "w_mbytes_per_sec": 0 00:16:15.337 }, 00:16:15.337 "claimed": true, 00:16:15.337 "claim_type": "exclusive_write", 00:16:15.337 "zoned": false, 00:16:15.337 "supported_io_types": { 00:16:15.337 "read": true, 00:16:15.337 "write": true, 00:16:15.337 "unmap": true, 00:16:15.337 "flush": true, 00:16:15.337 "reset": true, 00:16:15.337 "nvme_admin": false, 00:16:15.337 "nvme_io": false, 00:16:15.337 "nvme_io_md": false, 00:16:15.337 "write_zeroes": true, 00:16:15.337 "zcopy": true, 00:16:15.337 "get_zone_info": false, 00:16:15.337 "zone_management": false, 00:16:15.337 "zone_append": false, 00:16:15.337 "compare": false, 00:16:15.337 "compare_and_write": false, 00:16:15.337 "abort": true, 
00:16:15.337 "seek_hole": false, 00:16:15.337 "seek_data": false, 00:16:15.337 "copy": true, 00:16:15.337 "nvme_iov_md": false 00:16:15.337 }, 00:16:15.337 "memory_domains": [ 00:16:15.337 { 00:16:15.337 "dma_device_id": "system", 00:16:15.337 "dma_device_type": 1 00:16:15.337 }, 00:16:15.337 { 00:16:15.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.337 "dma_device_type": 2 00:16:15.337 } 00:16:15.337 ], 00:16:15.337 "driver_specific": { 00:16:15.337 "passthru": { 00:16:15.337 "name": "pt2", 00:16:15.337 "base_bdev_name": "malloc2" 00:16:15.337 } 00:16:15.337 } 00:16:15.337 }' 00:16:15.337 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.337 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.337 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:15.337 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.337 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.337 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:15.337 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.597 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:15.597 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:15.597 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.597 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:15.597 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:15.597 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:15.597 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:15.597 15:52:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:15.856 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:15.856 "name": "pt3", 00:16:15.856 "aliases": [ 00:16:15.856 "00000000-0000-0000-0000-000000000003" 00:16:15.856 ], 00:16:15.856 "product_name": "passthru", 00:16:15.856 "block_size": 512, 00:16:15.856 "num_blocks": 65536, 00:16:15.856 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:15.856 "assigned_rate_limits": { 00:16:15.856 "rw_ios_per_sec": 0, 00:16:15.856 "rw_mbytes_per_sec": 0, 00:16:15.856 "r_mbytes_per_sec": 0, 00:16:15.856 "w_mbytes_per_sec": 0 00:16:15.856 }, 00:16:15.856 "claimed": true, 00:16:15.856 "claim_type": "exclusive_write", 00:16:15.856 "zoned": false, 00:16:15.856 "supported_io_types": { 00:16:15.856 "read": true, 00:16:15.856 "write": true, 00:16:15.856 "unmap": true, 00:16:15.856 "flush": true, 00:16:15.856 "reset": true, 00:16:15.856 "nvme_admin": false, 00:16:15.856 "nvme_io": false, 00:16:15.856 "nvme_io_md": false, 00:16:15.856 "write_zeroes": true, 00:16:15.856 "zcopy": true, 00:16:15.856 "get_zone_info": false, 00:16:15.856 "zone_management": false, 00:16:15.856 "zone_append": false, 00:16:15.856 "compare": false, 00:16:15.856 "compare_and_write": false, 00:16:15.857 "abort": true, 00:16:15.857 "seek_hole": false, 00:16:15.857 "seek_data": false, 00:16:15.857 "copy": true, 00:16:15.857 "nvme_iov_md": false 
00:16:15.857 }, 00:16:15.857 "memory_domains": [ 00:16:15.857 { 00:16:15.857 "dma_device_id": "system", 00:16:15.857 "dma_device_type": 1 00:16:15.857 }, 00:16:15.857 { 00:16:15.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.857 "dma_device_type": 2 00:16:15.857 } 00:16:15.857 ], 00:16:15.857 "driver_specific": { 00:16:15.857 "passthru": { 00:16:15.857 "name": "pt3", 00:16:15.857 "base_bdev_name": "malloc3" 00:16:15.857 } 00:16:15.857 } 00:16:15.857 }' 00:16:15.857 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.857 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:15.857 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:15.857 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.857 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:15.857 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:15.857 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:16.116 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:16.116 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:16.116 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:16.116 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:16.116 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:16.116 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:16.116 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:16:16.375 [2024-07-12 15:52:36.627058] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:16.375 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=219e7755-d64c-41c0-a6c3-5227877df989 00:16:16.375 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 219e7755-d64c-41c0-a6c3-5227877df989 ']' 00:16:16.375 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:16.375 [2024-07-12 15:52:36.819322] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:16.375 [2024-07-12 15:52:36.819331] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:16.375 [2024-07-12 15:52:36.819366] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:16.375 [2024-07-12 15:52:36.819407] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:16.375 [2024-07-12 15:52:36.819413] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f28cb0 name raid_bdev1, state offline 00:16:16.635 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.635 15:52:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:16:16.635 15:52:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:16:16.635 15:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:16:16.635 15:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:16.635 15:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:17.204 15:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:17.204 15:52:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:17.774 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:16:17.774 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:18.033 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:18.033 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:18.293 [2024-07-12 15:52:38.663987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:18.293 [2024-07-12 15:52:38.665099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:18.293 [2024-07-12 15:52:38.665132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:18.293 [2024-07-12 15:52:38.665166] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:18.293 [2024-07-12 15:52:38.665192] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:18.293 [2024-07-12 15:52:38.665206] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:18.293 [2024-07-12 15:52:38.665216] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:18.293 [2024-07-12 15:52:38.665222] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f34360 name raid_bdev1, state configuring 00:16:18.293 request: 00:16:18.293 { 00:16:18.293 "name": "raid_bdev1", 00:16:18.293 "raid_level": "concat", 00:16:18.293 "base_bdevs": [ 00:16:18.293 "malloc1", 00:16:18.293 "malloc2", 00:16:18.293 "malloc3" 00:16:18.293 ], 00:16:18.293 "strip_size_kb": 64, 00:16:18.293 "superblock": false, 00:16:18.293 "method": "bdev_raid_create", 00:16:18.293 "req_id": 1 00:16:18.293 } 00:16:18.293 Got JSON-RPC error response 00:16:18.293 response: 00:16:18.293 { 00:16:18.293 "code": -17, 00:16:18.293 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:18.293 } 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.293 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:16:18.552 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:16:18.552 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:16:18.552 15:52:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:18.812 [2024-07-12 15:52:39.048911] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:18.812 [2024-07-12 15:52:39.048937] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:18.812 [2024-07-12 15:52:39.048948] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2bd80 00:16:18.812 [2024-07-12 15:52:39.048955] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:18.812 [2024-07-12 15:52:39.050203] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:18.812 [2024-07-12 15:52:39.050222] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:18.812 [2024-07-12 15:52:39.050267] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:18.812 [2024-07-12 15:52:39.050284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:18.812 pt1 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.812 "name": "raid_bdev1", 00:16:18.812 "uuid": "219e7755-d64c-41c0-a6c3-5227877df989", 00:16:18.812 "strip_size_kb": 64, 00:16:18.812 "state": "configuring", 00:16:18.812 "raid_level": "concat", 00:16:18.812 "superblock": true, 00:16:18.812 "num_base_bdevs": 3, 00:16:18.812 "num_base_bdevs_discovered": 1, 00:16:18.812 "num_base_bdevs_operational": 3, 00:16:18.812 "base_bdevs_list": [ 00:16:18.812 { 00:16:18.812 "name": "pt1", 00:16:18.812 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:18.812 "is_configured": true, 00:16:18.812 "data_offset": 2048, 00:16:18.812 "data_size": 63488 00:16:18.812 }, 00:16:18.812 { 00:16:18.812 "name": null, 00:16:18.812 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:18.812 "is_configured": false, 00:16:18.812 "data_offset": 2048, 00:16:18.812 "data_size": 63488 00:16:18.812 }, 00:16:18.812 { 00:16:18.812 "name": null, 00:16:18.812 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:18.812 "is_configured": false, 00:16:18.812 "data_offset": 2048, 00:16:18.812 "data_size": 63488 00:16:18.812 } 00:16:18.812 ] 00:16:18.812 }' 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.812 15:52:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.381 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:16:19.381 15:52:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:19.951 [2024-07-12 
15:52:40.320259] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:19.951 [2024-07-12 15:52:40.320296] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:19.951 [2024-07-12 15:52:40.320311] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f290f0 00:16:19.951 [2024-07-12 15:52:40.320317] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:19.951 [2024-07-12 15:52:40.320579] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:19.951 [2024-07-12 15:52:40.320589] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:19.951 [2024-07-12 15:52:40.320633] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:19.951 [2024-07-12 15:52:40.320645] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:19.951 pt2 00:16:19.951 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:20.210 [2024-07-12 15:52:40.528791] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.210 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:20.469 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.469 "name": "raid_bdev1", 00:16:20.469 "uuid": "219e7755-d64c-41c0-a6c3-5227877df989", 00:16:20.469 "strip_size_kb": 64, 00:16:20.469 "state": "configuring", 00:16:20.469 "raid_level": "concat", 00:16:20.469 "superblock": true, 00:16:20.469 "num_base_bdevs": 3, 00:16:20.469 "num_base_bdevs_discovered": 1, 00:16:20.469 "num_base_bdevs_operational": 3, 00:16:20.469 "base_bdevs_list": [ 00:16:20.469 { 00:16:20.469 "name": "pt1", 00:16:20.469 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:20.469 "is_configured": true, 00:16:20.469 "data_offset": 2048, 00:16:20.469 "data_size": 63488 00:16:20.469 }, 00:16:20.469 { 00:16:20.469 "name": null, 00:16:20.469 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:20.469 "is_configured": false, 
00:16:20.469 "data_offset": 2048, 00:16:20.469 "data_size": 63488 00:16:20.469 }, 00:16:20.469 { 00:16:20.469 "name": null, 00:16:20.469 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:20.469 "is_configured": false, 00:16:20.469 "data_offset": 2048, 00:16:20.469 "data_size": 63488 00:16:20.469 } 00:16:20.469 ] 00:16:20.469 }' 00:16:20.469 15:52:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.469 15:52:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.038 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:16:21.038 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:21.038 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:21.038 [2024-07-12 15:52:41.439100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:21.038 [2024-07-12 15:52:41.439131] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.038 [2024-07-12 15:52:41.439141] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f272e0 00:16:21.038 [2024-07-12 15:52:41.439148] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.038 [2024-07-12 15:52:41.439404] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:21.038 [2024-07-12 15:52:41.439414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:21.039 [2024-07-12 15:52:41.439456] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:21.039 [2024-07-12 15:52:41.439467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:21.039 pt2 00:16:21.039 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:21.039 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:21.039 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:21.298 [2024-07-12 15:52:41.631585] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:21.298 [2024-07-12 15:52:41.631607] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:21.298 [2024-07-12 15:52:41.631615] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d7a000 00:16:21.298 [2024-07-12 15:52:41.631621] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:21.298 [2024-07-12 15:52:41.631841] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:21.298 [2024-07-12 15:52:41.631851] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:21.298 [2024-07-12 15:52:41.631883] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:21.298 [2024-07-12 15:52:41.631893] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:21.299 [2024-07-12 15:52:41.631970] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f27d30 00:16:21.299 [2024-07-12 15:52:41.631976] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:21.299 [2024-07-12 15:52:41.632107] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d78d70 00:16:21.299 [2024-07-12 15:52:41.632208] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f27d30 00:16:21.299 [2024-07-12 15:52:41.632213] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f27d30 00:16:21.299 [2024-07-12 15:52:41.632282] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:21.299 pt3 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.299 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:21.558 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:21.558 "name": "raid_bdev1", 00:16:21.558 "uuid": "219e7755-d64c-41c0-a6c3-5227877df989", 00:16:21.558 "strip_size_kb": 64, 00:16:21.558 "state": "online", 00:16:21.558 "raid_level": "concat", 00:16:21.558 "superblock": true, 00:16:21.558 "num_base_bdevs": 3, 00:16:21.558 "num_base_bdevs_discovered": 3, 00:16:21.558 "num_base_bdevs_operational": 3, 00:16:21.558 "base_bdevs_list": [ 00:16:21.558 { 00:16:21.558 "name": "pt1", 00:16:21.558 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:21.558 "is_configured": true, 00:16:21.558 "data_offset": 2048, 00:16:21.558 "data_size": 63488 00:16:21.559 }, 00:16:21.559 { 00:16:21.559 "name": "pt2", 00:16:21.559 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:21.559 "is_configured": true, 00:16:21.559 "data_offset": 2048, 00:16:21.559 "data_size": 63488 00:16:21.559 }, 00:16:21.559 { 00:16:21.559 "name": "pt3", 00:16:21.559 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:21.559 "is_configured": true, 00:16:21.559 "data_offset": 2048, 00:16:21.559 "data_size": 63488 00:16:21.559 } 00:16:21.559 ] 00:16:21.559 }' 00:16:21.559 15:52:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:21.559 15:52:41 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:22.128 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:16:22.128 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:22.128 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:22.128 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:22.128 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:22.128 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:22.128 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:22.128 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:22.128 [2024-07-12 15:52:42.574190] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:22.389 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:22.389 "name": "raid_bdev1", 00:16:22.389 "aliases": [ 00:16:22.389 "219e7755-d64c-41c0-a6c3-5227877df989" 00:16:22.389 ], 00:16:22.389 "product_name": "Raid Volume", 00:16:22.389 "block_size": 512, 00:16:22.389 "num_blocks": 190464, 00:16:22.389 "uuid": "219e7755-d64c-41c0-a6c3-5227877df989", 00:16:22.389 "assigned_rate_limits": { 00:16:22.390 "rw_ios_per_sec": 0, 00:16:22.390 "rw_mbytes_per_sec": 0, 00:16:22.390 "r_mbytes_per_sec": 0, 00:16:22.390 "w_mbytes_per_sec": 0 00:16:22.390 }, 00:16:22.390 "claimed": false, 00:16:22.390 "zoned": false, 00:16:22.390 "supported_io_types": { 00:16:22.390 "read": true, 00:16:22.390 "write": true, 00:16:22.390 "unmap": true, 00:16:22.390 "flush": true, 00:16:22.390 "reset": true, 00:16:22.390 "nvme_admin": false, 00:16:22.390 "nvme_io": false, 00:16:22.390 "nvme_io_md": false, 00:16:22.390 "write_zeroes": true, 00:16:22.390 "zcopy": false, 00:16:22.390 "get_zone_info": false, 00:16:22.390 "zone_management": false, 00:16:22.390 "zone_append": false, 00:16:22.390 "compare": false, 00:16:22.390 "compare_and_write": false, 00:16:22.390 "abort": false, 00:16:22.390 "seek_hole": false, 00:16:22.390 "seek_data": false, 00:16:22.390 "copy": false, 00:16:22.390 "nvme_iov_md": false 00:16:22.390 }, 00:16:22.390 "memory_domains": [ 00:16:22.390 { 00:16:22.390 "dma_device_id": "system", 00:16:22.390 "dma_device_type": 1 00:16:22.390 }, 00:16:22.390 { 00:16:22.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.390 "dma_device_type": 2 00:16:22.390 }, 00:16:22.390 { 00:16:22.390 "dma_device_id": "system", 00:16:22.390 "dma_device_type": 1 00:16:22.390 }, 00:16:22.390 { 00:16:22.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.390 "dma_device_type": 2 00:16:22.390 }, 00:16:22.390 { 00:16:22.390 "dma_device_id": "system", 00:16:22.390 "dma_device_type": 1 00:16:22.390 }, 00:16:22.390 { 00:16:22.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.390 "dma_device_type": 2 00:16:22.390 } 00:16:22.390 ], 00:16:22.390 "driver_specific": { 00:16:22.390 "raid": { 00:16:22.390 "uuid": "219e7755-d64c-41c0-a6c3-5227877df989", 00:16:22.390 "strip_size_kb": 64, 00:16:22.390 "state": "online", 00:16:22.390 "raid_level": "concat", 00:16:22.390 "superblock": true, 00:16:22.390 "num_base_bdevs": 3, 00:16:22.390 "num_base_bdevs_discovered": 3, 
00:16:22.390 "num_base_bdevs_operational": 3, 00:16:22.390 "base_bdevs_list": [ 00:16:22.390 { 00:16:22.390 "name": "pt1", 00:16:22.390 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:22.390 "is_configured": true, 00:16:22.390 "data_offset": 2048, 00:16:22.390 "data_size": 63488 00:16:22.390 }, 00:16:22.390 { 00:16:22.390 "name": "pt2", 00:16:22.390 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:22.390 "is_configured": true, 00:16:22.390 "data_offset": 2048, 00:16:22.390 "data_size": 63488 00:16:22.390 }, 00:16:22.390 { 00:16:22.390 "name": "pt3", 00:16:22.390 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:22.390 "is_configured": true, 00:16:22.390 "data_offset": 2048, 00:16:22.390 "data_size": 63488 00:16:22.390 } 00:16:22.390 ] 00:16:22.390 } 00:16:22.390 } 00:16:22.390 }' 00:16:22.390 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:22.390 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:22.390 pt2 00:16:22.390 pt3' 00:16:22.390 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:22.390 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:22.390 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:22.390 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:22.390 "name": "pt1", 00:16:22.390 "aliases": [ 00:16:22.390 "00000000-0000-0000-0000-000000000001" 00:16:22.390 ], 00:16:22.390 "product_name": "passthru", 00:16:22.390 "block_size": 512, 00:16:22.390 "num_blocks": 65536, 00:16:22.390 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:22.390 "assigned_rate_limits": { 00:16:22.390 "rw_ios_per_sec": 0, 00:16:22.390 "rw_mbytes_per_sec": 0, 00:16:22.390 "r_mbytes_per_sec": 0, 00:16:22.390 "w_mbytes_per_sec": 0 00:16:22.390 }, 00:16:22.390 "claimed": true, 00:16:22.390 "claim_type": "exclusive_write", 00:16:22.390 "zoned": false, 00:16:22.390 "supported_io_types": { 00:16:22.390 "read": true, 00:16:22.390 "write": true, 00:16:22.390 "unmap": true, 00:16:22.390 "flush": true, 00:16:22.390 "reset": true, 00:16:22.390 "nvme_admin": false, 00:16:22.390 "nvme_io": false, 00:16:22.390 "nvme_io_md": false, 00:16:22.390 "write_zeroes": true, 00:16:22.390 "zcopy": true, 00:16:22.390 "get_zone_info": false, 00:16:22.390 "zone_management": false, 00:16:22.390 "zone_append": false, 00:16:22.390 "compare": false, 00:16:22.390 "compare_and_write": false, 00:16:22.390 "abort": true, 00:16:22.390 "seek_hole": false, 00:16:22.390 "seek_data": false, 00:16:22.390 "copy": true, 00:16:22.390 "nvme_iov_md": false 00:16:22.390 }, 00:16:22.390 "memory_domains": [ 00:16:22.390 { 00:16:22.390 "dma_device_id": "system", 00:16:22.390 "dma_device_type": 1 00:16:22.390 }, 00:16:22.390 { 00:16:22.390 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.390 "dma_device_type": 2 00:16:22.390 } 00:16:22.390 ], 00:16:22.390 "driver_specific": { 00:16:22.390 "passthru": { 00:16:22.390 "name": "pt1", 00:16:22.390 "base_bdev_name": "malloc1" 00:16:22.390 } 00:16:22.390 } 00:16:22.390 }' 00:16:22.390 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:22.650 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:16:22.650 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:22.650 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.650 15:52:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:22.650 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:22.650 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.650 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:22.911 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:22.911 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.911 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:22.911 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:22.911 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:22.911 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:22.911 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:23.171 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:23.171 "name": "pt2", 00:16:23.171 "aliases": [ 00:16:23.172 "00000000-0000-0000-0000-000000000002" 00:16:23.172 ], 00:16:23.172 "product_name": "passthru", 00:16:23.172 "block_size": 512, 00:16:23.172 "num_blocks": 65536, 00:16:23.172 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:23.172 "assigned_rate_limits": { 00:16:23.172 "rw_ios_per_sec": 0, 00:16:23.172 "rw_mbytes_per_sec": 0, 00:16:23.172 "r_mbytes_per_sec": 0, 00:16:23.172 "w_mbytes_per_sec": 0 00:16:23.172 }, 00:16:23.172 "claimed": true, 00:16:23.172 "claim_type": "exclusive_write", 00:16:23.172 "zoned": false, 00:16:23.172 "supported_io_types": { 00:16:23.172 "read": true, 00:16:23.172 "write": true, 00:16:23.172 "unmap": true, 00:16:23.172 "flush": true, 00:16:23.172 "reset": true, 00:16:23.172 "nvme_admin": false, 00:16:23.172 "nvme_io": false, 00:16:23.172 "nvme_io_md": false, 00:16:23.172 "write_zeroes": true, 00:16:23.172 "zcopy": true, 00:16:23.172 "get_zone_info": false, 00:16:23.172 "zone_management": false, 00:16:23.172 "zone_append": false, 00:16:23.172 "compare": false, 00:16:23.172 "compare_and_write": false, 00:16:23.172 "abort": true, 00:16:23.172 "seek_hole": false, 00:16:23.172 "seek_data": false, 00:16:23.172 "copy": true, 00:16:23.172 "nvme_iov_md": false 00:16:23.172 }, 00:16:23.172 "memory_domains": [ 00:16:23.172 { 00:16:23.172 "dma_device_id": "system", 00:16:23.172 "dma_device_type": 1 00:16:23.172 }, 00:16:23.172 { 00:16:23.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.172 "dma_device_type": 2 00:16:23.172 } 00:16:23.172 ], 00:16:23.172 "driver_specific": { 00:16:23.172 "passthru": { 00:16:23.172 "name": "pt2", 00:16:23.172 "base_bdev_name": "malloc2" 00:16:23.172 } 00:16:23.172 } 00:16:23.172 }' 00:16:23.172 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.172 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.172 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:23.172 15:52:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.172 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.172 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:23.172 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.432 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.432 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:23.432 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.432 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.432 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:23.432 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:23.432 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:23.432 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:23.692 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:23.692 "name": "pt3", 00:16:23.692 "aliases": [ 00:16:23.692 "00000000-0000-0000-0000-000000000003" 00:16:23.692 ], 00:16:23.692 "product_name": "passthru", 00:16:23.692 "block_size": 512, 00:16:23.692 "num_blocks": 65536, 00:16:23.692 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:23.692 "assigned_rate_limits": { 00:16:23.692 "rw_ios_per_sec": 0, 00:16:23.692 "rw_mbytes_per_sec": 0, 00:16:23.692 "r_mbytes_per_sec": 0, 00:16:23.692 "w_mbytes_per_sec": 0 00:16:23.692 }, 00:16:23.692 "claimed": true, 00:16:23.692 "claim_type": "exclusive_write", 00:16:23.692 "zoned": false, 00:16:23.692 "supported_io_types": { 00:16:23.692 "read": true, 00:16:23.692 "write": true, 00:16:23.692 "unmap": true, 00:16:23.692 "flush": true, 00:16:23.692 "reset": true, 00:16:23.692 "nvme_admin": false, 00:16:23.692 "nvme_io": false, 00:16:23.692 "nvme_io_md": false, 00:16:23.692 "write_zeroes": true, 00:16:23.692 "zcopy": true, 00:16:23.692 "get_zone_info": false, 00:16:23.692 "zone_management": false, 00:16:23.692 "zone_append": false, 00:16:23.692 "compare": false, 00:16:23.692 "compare_and_write": false, 00:16:23.692 "abort": true, 00:16:23.692 "seek_hole": false, 00:16:23.692 "seek_data": false, 00:16:23.692 "copy": true, 00:16:23.692 "nvme_iov_md": false 00:16:23.692 }, 00:16:23.692 "memory_domains": [ 00:16:23.692 { 00:16:23.692 "dma_device_id": "system", 00:16:23.692 "dma_device_type": 1 00:16:23.692 }, 00:16:23.692 { 00:16:23.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:23.692 "dma_device_type": 2 00:16:23.692 } 00:16:23.692 ], 00:16:23.692 "driver_specific": { 00:16:23.692 "passthru": { 00:16:23.692 "name": "pt3", 00:16:23.692 "base_bdev_name": "malloc3" 00:16:23.692 } 00:16:23.693 } 00:16:23.693 }' 00:16:23.693 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.693 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:23.693 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:23.693 15:52:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.693 15:52:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:23.693 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:23.693 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.693 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:23.952 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:23.952 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.952 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:23.952 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:23.952 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:23.952 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:16:23.952 [2024-07-12 15:52:44.382764] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 219e7755-d64c-41c0-a6c3-5227877df989 '!=' 219e7755-d64c-41c0-a6c3-5227877df989 ']' 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2547419 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2547419 ']' 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2547419 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2547419 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2547419' 00:16:24.253 killing process with pid 2547419 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2547419 00:16:24.253 [2024-07-12 15:52:44.456643] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:24.253 [2024-07-12 15:52:44.456681] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:24.253 [2024-07-12 15:52:44.456731] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:24.253 [2024-07-12 15:52:44.456738] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f27d30 name raid_bdev1, state offline 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2547419 00:16:24.253 [2024-07-12 15:52:44.471730] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:24.253 15:52:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:24.253 00:16:24.253 real 0m13.603s 00:16:24.253 user 0m25.581s 00:16:24.253 sys 0m1.961s 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:24.253 15:52:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.253 ************************************ 00:16:24.253 END TEST raid_superblock_test 00:16:24.253 ************************************ 00:16:24.253 15:52:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:24.253 15:52:44 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:16:24.253 15:52:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:24.253 15:52:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:24.253 15:52:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:24.253 ************************************ 00:16:24.253 START TEST raid_read_error_test 00:16:24.253 ************************************ 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:24.253 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:24.254 15:52:44 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.44BKlAXWZA 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2549879 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2549879 /var/tmp/spdk-raid.sock 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2549879 ']' 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:24.254 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:24.254 15:52:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.529 [2024-07-12 15:52:44.740920] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:16:24.529 [2024-07-12 15:52:44.740975] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2549879 ] 00:16:24.529 [2024-07-12 15:52:44.849837] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:24.529 [2024-07-12 15:52:44.918398] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:24.529 [2024-07-12 15:52:44.958871] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:24.529 [2024-07-12 15:52:44.958894] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:25.468 15:52:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:25.468 15:52:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:25.468 15:52:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:25.468 15:52:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:25.468 BaseBdev1_malloc 00:16:25.468 15:52:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:25.727 true 00:16:25.727 15:52:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:25.727 [2024-07-12 15:52:46.121423] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:25.727 [2024-07-12 15:52:46.121457] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:25.727 [2024-07-12 15:52:46.121468] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15e5aa0 00:16:25.727 [2024-07-12 15:52:46.121474] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:25.727 [2024-07-12 15:52:46.122724] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:25.727 [2024-07-12 15:52:46.122745] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:25.727 BaseBdev1 00:16:25.727 15:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:25.727 15:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:25.987 BaseBdev2_malloc 00:16:25.987 15:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:26.247 true 00:16:26.247 15:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:26.247 [2024-07-12 15:52:46.680630] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:26.247 [2024-07-12 15:52:46.680658] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:26.247 [2024-07-12 15:52:46.680670] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15eae40 00:16:26.247 [2024-07-12 15:52:46.680678] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:26.247 [2024-07-12 15:52:46.681864] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:26.247 [2024-07-12 15:52:46.681884] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:26.247 BaseBdev2 00:16:26.507 15:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:26.507 15:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:26.507 BaseBdev3_malloc 00:16:26.507 15:52:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:26.767 true 00:16:26.767 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:27.027 [2024-07-12 15:52:47.243890] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:27.027 [2024-07-12 15:52:47.243917] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:27.027 [2024-07-12 15:52:47.243930] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15ec7f0 00:16:27.027 [2024-07-12 15:52:47.243937] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:27.027 [2024-07-12 15:52:47.245127] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:27.027 [2024-07-12 15:52:47.245145] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:27.027 BaseBdev3 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:27.027 [2024-07-12 15:52:47.432389] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:27.027 [2024-07-12 15:52:47.433399] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:27.027 [2024-07-12 15:52:47.433453] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:27.027 [2024-07-12 15:52:47.433608] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15ea750 00:16:27.027 [2024-07-12 15:52:47.433619] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:27.027 [2024-07-12 15:52:47.433773] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ed970 00:16:27.027 [2024-07-12 15:52:47.433888] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15ea750 00:16:27.027 [2024-07-12 15:52:47.433894] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15ea750 00:16:27.027 [2024-07-12 15:52:47.433968] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:27.027 
15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.027 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:27.286 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.286 "name": "raid_bdev1", 00:16:27.286 "uuid": "144cc1d8-6269-41d7-9e7f-6ee5ca4a31f0", 00:16:27.286 "strip_size_kb": 64, 00:16:27.286 "state": "online", 00:16:27.286 "raid_level": "concat", 00:16:27.286 "superblock": true, 00:16:27.286 "num_base_bdevs": 3, 00:16:27.286 "num_base_bdevs_discovered": 3, 00:16:27.286 "num_base_bdevs_operational": 3, 00:16:27.286 "base_bdevs_list": [ 00:16:27.286 { 00:16:27.286 "name": "BaseBdev1", 00:16:27.286 "uuid": "01538f68-ee39-57c3-b302-cf8d1a73454f", 00:16:27.286 "is_configured": true, 00:16:27.286 "data_offset": 2048, 00:16:27.286 "data_size": 63488 00:16:27.286 }, 00:16:27.286 { 00:16:27.286 "name": "BaseBdev2", 00:16:27.286 "uuid": "6bfef3dd-dd91-5344-8b06-5fd9a4a486f6", 00:16:27.286 "is_configured": true, 00:16:27.286 "data_offset": 2048, 00:16:27.286 "data_size": 63488 00:16:27.286 }, 00:16:27.286 { 00:16:27.286 "name": "BaseBdev3", 00:16:27.286 "uuid": "af2ae5c6-b109-5a65-8d34-812460ad26d2", 00:16:27.286 "is_configured": true, 00:16:27.286 "data_offset": 2048, 00:16:27.286 "data_size": 63488 00:16:27.286 } 00:16:27.286 ] 00:16:27.286 }' 00:16:27.286 15:52:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.286 15:52:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.854 15:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:27.854 15:52:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:27.854 [2024-07-12 15:52:48.270719] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15ed700 00:16:28.792 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local 
expected_num_base_bdevs 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.051 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:29.310 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.310 "name": "raid_bdev1", 00:16:29.310 "uuid": "144cc1d8-6269-41d7-9e7f-6ee5ca4a31f0", 00:16:29.310 "strip_size_kb": 64, 00:16:29.310 "state": "online", 00:16:29.310 "raid_level": "concat", 00:16:29.310 "superblock": true, 00:16:29.310 "num_base_bdevs": 3, 00:16:29.310 "num_base_bdevs_discovered": 3, 00:16:29.310 "num_base_bdevs_operational": 3, 00:16:29.310 "base_bdevs_list": [ 00:16:29.310 { 00:16:29.310 "name": "BaseBdev1", 00:16:29.310 "uuid": "01538f68-ee39-57c3-b302-cf8d1a73454f", 00:16:29.310 "is_configured": true, 00:16:29.310 "data_offset": 2048, 00:16:29.310 "data_size": 63488 00:16:29.310 }, 00:16:29.310 { 00:16:29.310 "name": "BaseBdev2", 00:16:29.310 "uuid": "6bfef3dd-dd91-5344-8b06-5fd9a4a486f6", 00:16:29.310 "is_configured": true, 00:16:29.310 "data_offset": 2048, 00:16:29.310 "data_size": 63488 00:16:29.310 }, 00:16:29.310 { 00:16:29.310 "name": "BaseBdev3", 00:16:29.310 "uuid": "af2ae5c6-b109-5a65-8d34-812460ad26d2", 00:16:29.310 "is_configured": true, 00:16:29.310 "data_offset": 2048, 00:16:29.310 "data_size": 63488 00:16:29.310 } 00:16:29.310 ] 00:16:29.310 }' 00:16:29.310 15:52:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.310 15:52:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.878 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:29.878 [2024-07-12 15:52:50.322668] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:29.878 [2024-07-12 15:52:50.322695] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:29.878 [2024-07-12 
15:52:50.325287] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:29.878 [2024-07-12 15:52:50.325312] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:29.878 [2024-07-12 15:52:50.325338] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:29.878 [2024-07-12 15:52:50.325344] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15ea750 name raid_bdev1, state offline 00:16:30.137 0 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2549879 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2549879 ']' 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2549879 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2549879 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2549879' 00:16:30.137 killing process with pid 2549879 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2549879 00:16:30.137 [2024-07-12 15:52:50.387988] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2549879 00:16:30.137 [2024-07-12 15:52:50.399111] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.44BKlAXWZA 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:16:30.137 00:16:30.137 real 0m5.859s 00:16:30.137 user 0m9.295s 00:16:30.137 sys 0m0.841s 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:30.137 15:52:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:30.137 ************************************ 00:16:30.137 END TEST raid_read_error_test 00:16:30.137 ************************************ 00:16:30.137 15:52:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:30.137 15:52:50 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:16:30.137 15:52:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 
00:16:30.137 15:52:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:30.137 15:52:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:30.397 ************************************ 00:16:30.397 START TEST raid_write_error_test 00:16:30.397 ************************************ 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.fnwbsBhM8v 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2551063 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2551063 /var/tmp/spdk-raid.sock 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2551063 ']' 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:30.397 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:30.397 15:52:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:30.397 [2024-07-12 15:52:50.687168] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:16:30.397 [2024-07-12 15:52:50.687229] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2551063 ] 00:16:30.397 [2024-07-12 15:52:50.777073] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:30.656 [2024-07-12 15:52:50.844663] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:30.656 [2024-07-12 15:52:50.888717] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:30.656 [2024-07-12 15:52:50.888741] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:31.224 15:52:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:31.224 15:52:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:31.224 15:52:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:31.224 15:52:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:31.482 BaseBdev1_malloc 00:16:31.482 15:52:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:31.482 true 00:16:31.482 15:52:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:31.742 [2024-07-12 15:52:52.059241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:31.742 [2024-07-12 15:52:52.059272] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:31.742 [2024-07-12 15:52:52.059285] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfdeaa0 00:16:31.742 [2024-07-12 15:52:52.059291] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:31.742 [2024-07-12 15:52:52.060537] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:31.742 [2024-07-12 
15:52:52.060557] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:31.742 BaseBdev1 00:16:31.742 15:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:31.742 15:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:32.001 BaseBdev2_malloc 00:16:32.001 15:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:32.001 true 00:16:32.260 15:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:32.260 [2024-07-12 15:52:52.630612] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:32.260 [2024-07-12 15:52:52.630641] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:32.260 [2024-07-12 15:52:52.630652] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe3e40 00:16:32.260 [2024-07-12 15:52:52.630659] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:32.260 [2024-07-12 15:52:52.631865] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:32.260 [2024-07-12 15:52:52.631885] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:32.260 BaseBdev2 00:16:32.260 15:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:32.260 15:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:32.519 BaseBdev3_malloc 00:16:32.519 15:52:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:32.777 true 00:16:32.777 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:32.777 [2024-07-12 15:52:53.201988] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:32.777 [2024-07-12 15:52:53.202016] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:32.777 [2024-07-12 15:52:53.202029] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfe57f0 00:16:32.777 [2024-07-12 15:52:53.202035] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:32.777 [2024-07-12 15:52:53.203226] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:32.777 [2024-07-12 15:52:53.203246] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:32.777 BaseBdev3 00:16:32.777 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 
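Before the trace reaches bdev_raid_create, each of the three base bdevs has been built as a malloc → error → passthru stack, so that write failures can later be injected beneath the raid. The sketch below condenses the RPC calls for one base bdev plus the final raid create, using the same illustrative RPC/SOCK shorthands as in the earlier sketch; the first three calls are repeated for BaseBdev2 and BaseBdev3.

  # one base-bdev stack (repeated for BaseBdev2 and BaseBdev3)
  $RPC -s $SOCK bdev_malloc_create 32 512 -b BaseBdev1_malloc            # backing malloc disk
  $RPC -s $SOCK bdev_error_create BaseBdev1_malloc                       # error-injection wrapper (EE_BaseBdev1_malloc)
  $RPC -s $SOCK bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 # name consumed by the raid

  # assemble the 3-disk concat volume with a 64k strip size and superblock
  $RPC -s $SOCK bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s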
00:16:33.035 [2024-07-12 15:52:53.390495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:33.035 [2024-07-12 15:52:53.391493] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:33.036 [2024-07-12 15:52:53.391546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:33.036 [2024-07-12 15:52:53.391698] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfe3750 00:16:33.036 [2024-07-12 15:52:53.391705] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:33.036 [2024-07-12 15:52:53.391866] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfe6970 00:16:33.036 [2024-07-12 15:52:53.391981] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfe3750 00:16:33.036 [2024-07-12 15:52:53.391987] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfe3750 00:16:33.036 [2024-07-12 15:52:53.392061] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.036 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:33.294 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.294 "name": "raid_bdev1", 00:16:33.294 "uuid": "bd5d7995-2474-4de2-b4c2-3c38cd190b8b", 00:16:33.294 "strip_size_kb": 64, 00:16:33.294 "state": "online", 00:16:33.294 "raid_level": "concat", 00:16:33.294 "superblock": true, 00:16:33.294 "num_base_bdevs": 3, 00:16:33.294 "num_base_bdevs_discovered": 3, 00:16:33.294 "num_base_bdevs_operational": 3, 00:16:33.294 "base_bdevs_list": [ 00:16:33.294 { 00:16:33.294 "name": "BaseBdev1", 00:16:33.294 "uuid": "8c639fd6-d050-5486-8163-55722739f031", 00:16:33.294 "is_configured": true, 00:16:33.294 "data_offset": 2048, 00:16:33.294 "data_size": 63488 00:16:33.294 }, 00:16:33.294 { 00:16:33.294 "name": "BaseBdev2", 00:16:33.294 "uuid": "999d9700-50b2-5276-b503-7be4bcaccfa6", 00:16:33.294 "is_configured": true, 00:16:33.294 "data_offset": 2048, 00:16:33.294 "data_size": 63488 00:16:33.294 }, 00:16:33.294 { 00:16:33.294 "name": 
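verify_raid_bdev_state, invoked right after the create, works by dumping every raid bdev over RPC and filtering the entry of interest with jq, as the bdev_raid.sh@126 lines above show. The individual field comparisons are not part of this excerpt, so the checks below are only illustrative of what gets asserted (online state, concat level, 64k strip, three operational base bdevs); RPC/SOCK as before.

  info=$($RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

  # illustrative assertions against the expected values
  [ "$(echo "$info" | jq -r '.state')" = online ]
  [ "$(echo "$info" | jq -r '.raid_level')" = concat ]
  [ "$(echo "$info" | jq -r '.strip_size_kb')" -eq 64 ]
  [ "$(echo "$info" | jq -r '.num_base_bdevs_operational')" -eq 3 ]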
"BaseBdev3", 00:16:33.294 "uuid": "9892357c-a827-5de8-ac06-9ecbb29a18d1", 00:16:33.294 "is_configured": true, 00:16:33.294 "data_offset": 2048, 00:16:33.294 "data_size": 63488 00:16:33.294 } 00:16:33.294 ] 00:16:33.294 }' 00:16:33.294 15:52:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.294 15:52:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:33.863 15:52:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:33.863 15:52:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:33.863 [2024-07-12 15:52:54.212783] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfe6700 00:16:34.803 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.062 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:35.322 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.322 "name": "raid_bdev1", 00:16:35.322 "uuid": "bd5d7995-2474-4de2-b4c2-3c38cd190b8b", 00:16:35.322 "strip_size_kb": 64, 00:16:35.322 "state": "online", 00:16:35.322 "raid_level": "concat", 00:16:35.322 "superblock": true, 00:16:35.322 "num_base_bdevs": 3, 00:16:35.322 "num_base_bdevs_discovered": 3, 00:16:35.322 "num_base_bdevs_operational": 3, 00:16:35.322 "base_bdevs_list": [ 00:16:35.322 { 00:16:35.322 "name": "BaseBdev1", 00:16:35.322 "uuid": "8c639fd6-d050-5486-8163-55722739f031", 00:16:35.322 "is_configured": true, 00:16:35.322 "data_offset": 2048, 00:16:35.322 "data_size": 63488 
00:16:35.322 }, 00:16:35.322 { 00:16:35.322 "name": "BaseBdev2", 00:16:35.322 "uuid": "999d9700-50b2-5276-b503-7be4bcaccfa6", 00:16:35.322 "is_configured": true, 00:16:35.322 "data_offset": 2048, 00:16:35.322 "data_size": 63488 00:16:35.322 }, 00:16:35.322 { 00:16:35.322 "name": "BaseBdev3", 00:16:35.322 "uuid": "9892357c-a827-5de8-ac06-9ecbb29a18d1", 00:16:35.322 "is_configured": true, 00:16:35.322 "data_offset": 2048, 00:16:35.322 "data_size": 63488 00:16:35.322 } 00:16:35.322 ] 00:16:35.322 }' 00:16:35.322 15:52:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.322 15:52:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:35.891 [2024-07-12 15:52:56.244495] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:35.891 [2024-07-12 15:52:56.244524] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:35.891 [2024-07-12 15:52:56.247112] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:35.891 [2024-07-12 15:52:56.247137] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:35.891 [2024-07-12 15:52:56.247163] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:35.891 [2024-07-12 15:52:56.247169] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfe3750 name raid_bdev1, state offline 00:16:35.891 0 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2551063 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2551063 ']' 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2551063 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2551063 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2551063' 00:16:35.891 killing process with pid 2551063 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2551063 00:16:35.891 [2024-07-12 15:52:56.314429] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:35.891 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2551063 00:16:35.891 [2024-07-12 15:52:56.325685] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:36.152 15:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.fnwbsBhM8v 00:16:36.152 15:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:36.152 15:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:36.152 15:52:56 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:16:36.152 15:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:16:36.152 15:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:36.152 15:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:36.152 15:52:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:16:36.152 00:16:36.152 real 0m5.852s 00:16:36.152 user 0m9.331s 00:16:36.152 sys 0m0.816s 00:16:36.152 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:36.152 15:52:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.152 ************************************ 00:16:36.152 END TEST raid_write_error_test 00:16:36.152 ************************************ 00:16:36.152 15:52:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:36.152 15:52:56 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:36.152 15:52:56 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:16:36.152 15:52:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:36.152 15:52:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:36.152 15:52:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:36.152 ************************************ 00:16:36.152 START TEST raid_state_function_test 00:16:36.152 ************************************ 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2552185 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2552185' 00:16:36.152 Process raid pid: 2552185 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2552185 /var/tmp/spdk-raid.sock 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2552185 ']' 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:36.152 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:36.152 15:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:36.412 [2024-07-12 15:52:56.603695] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
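raid_state_function_test drives a plain bdev_svc app instead of bdevperf. Its first step, traced below, is to create a raid1 volume named Existed_Raid while none of its base bdevs exist yet, which leaves the raid in the "configuring" state with zero discovered members; malloc bdevs are then created one at a time, and the raid is only reported "online" once all three are claimed. The actual script also deletes and recreates Existed_Raid between steps; the sketch below omits that and only shows the discovery progression, with the same illustrative RPC/SOCK shorthands as above.

  # register the raid before any base bdev exists -> state stays "configuring"
  $RPC -s $SOCK bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid

  # add base bdevs one by one and watch num_base_bdevs_discovered / state advance
  for b in BaseBdev1 BaseBdev2 BaseBdev3; do
      $RPC -s $SOCK bdev_malloc_create 32 512 -b "$b"
      $RPC -s $SOCK bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | "\(.num_base_bdevs_discovered) \(.state)"'
  done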
00:16:36.412 [2024-07-12 15:52:56.603762] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:36.412 [2024-07-12 15:52:56.693586] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:36.412 [2024-07-12 15:52:56.761886] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:36.412 [2024-07-12 15:52:56.800680] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:36.412 [2024-07-12 15:52:56.800701] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:37.350 [2024-07-12 15:52:57.619623] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:37.350 [2024-07-12 15:52:57.619651] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:37.350 [2024-07-12 15:52:57.619657] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:37.350 [2024-07-12 15:52:57.619662] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:37.350 [2024-07-12 15:52:57.619667] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:37.350 [2024-07-12 15:52:57.619673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.350 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.610 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:16:37.610 "name": "Existed_Raid", 00:16:37.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.610 "strip_size_kb": 0, 00:16:37.610 "state": "configuring", 00:16:37.610 "raid_level": "raid1", 00:16:37.611 "superblock": false, 00:16:37.611 "num_base_bdevs": 3, 00:16:37.611 "num_base_bdevs_discovered": 0, 00:16:37.611 "num_base_bdevs_operational": 3, 00:16:37.611 "base_bdevs_list": [ 00:16:37.611 { 00:16:37.611 "name": "BaseBdev1", 00:16:37.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.611 "is_configured": false, 00:16:37.611 "data_offset": 0, 00:16:37.611 "data_size": 0 00:16:37.611 }, 00:16:37.611 { 00:16:37.611 "name": "BaseBdev2", 00:16:37.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.611 "is_configured": false, 00:16:37.611 "data_offset": 0, 00:16:37.611 "data_size": 0 00:16:37.611 }, 00:16:37.611 { 00:16:37.611 "name": "BaseBdev3", 00:16:37.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.611 "is_configured": false, 00:16:37.611 "data_offset": 0, 00:16:37.611 "data_size": 0 00:16:37.611 } 00:16:37.611 ] 00:16:37.611 }' 00:16:37.611 15:52:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.611 15:52:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:38.179 15:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:38.179 [2024-07-12 15:52:58.541856] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:38.179 [2024-07-12 15:52:58.541872] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b20900 name Existed_Raid, state configuring 00:16:38.179 15:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:38.439 [2024-07-12 15:52:58.718312] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:38.439 [2024-07-12 15:52:58.718328] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:38.439 [2024-07-12 15:52:58.718334] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:38.439 [2024-07-12 15:52:58.718339] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:38.439 [2024-07-12 15:52:58.718344] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:38.439 [2024-07-12 15:52:58.718349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:38.439 15:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:38.698 [2024-07-12 15:52:58.901470] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:38.698 BaseBdev1 00:16:38.698 15:52:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:38.698 15:52:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:38.698 15:52:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:38.698 
15:52:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:38.698 15:52:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:38.698 15:52:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:38.698 15:52:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.698 15:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:38.958 [ 00:16:38.958 { 00:16:38.958 "name": "BaseBdev1", 00:16:38.958 "aliases": [ 00:16:38.958 "1b630d8c-a4c2-4d02-88bb-13154ca874fe" 00:16:38.958 ], 00:16:38.958 "product_name": "Malloc disk", 00:16:38.958 "block_size": 512, 00:16:38.958 "num_blocks": 65536, 00:16:38.958 "uuid": "1b630d8c-a4c2-4d02-88bb-13154ca874fe", 00:16:38.958 "assigned_rate_limits": { 00:16:38.958 "rw_ios_per_sec": 0, 00:16:38.958 "rw_mbytes_per_sec": 0, 00:16:38.958 "r_mbytes_per_sec": 0, 00:16:38.958 "w_mbytes_per_sec": 0 00:16:38.958 }, 00:16:38.958 "claimed": true, 00:16:38.958 "claim_type": "exclusive_write", 00:16:38.958 "zoned": false, 00:16:38.958 "supported_io_types": { 00:16:38.958 "read": true, 00:16:38.958 "write": true, 00:16:38.958 "unmap": true, 00:16:38.958 "flush": true, 00:16:38.958 "reset": true, 00:16:38.958 "nvme_admin": false, 00:16:38.958 "nvme_io": false, 00:16:38.958 "nvme_io_md": false, 00:16:38.958 "write_zeroes": true, 00:16:38.958 "zcopy": true, 00:16:38.958 "get_zone_info": false, 00:16:38.958 "zone_management": false, 00:16:38.958 "zone_append": false, 00:16:38.958 "compare": false, 00:16:38.958 "compare_and_write": false, 00:16:38.958 "abort": true, 00:16:38.958 "seek_hole": false, 00:16:38.958 "seek_data": false, 00:16:38.958 "copy": true, 00:16:38.958 "nvme_iov_md": false 00:16:38.958 }, 00:16:38.958 "memory_domains": [ 00:16:38.958 { 00:16:38.958 "dma_device_id": "system", 00:16:38.958 "dma_device_type": 1 00:16:38.958 }, 00:16:38.958 { 00:16:38.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.958 "dma_device_type": 2 00:16:38.958 } 00:16:38.958 ], 00:16:38.958 "driver_specific": {} 00:16:38.958 } 00:16:38.958 ] 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.958 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.219 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.219 "name": "Existed_Raid", 00:16:39.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.219 "strip_size_kb": 0, 00:16:39.219 "state": "configuring", 00:16:39.219 "raid_level": "raid1", 00:16:39.219 "superblock": false, 00:16:39.219 "num_base_bdevs": 3, 00:16:39.219 "num_base_bdevs_discovered": 1, 00:16:39.219 "num_base_bdevs_operational": 3, 00:16:39.219 "base_bdevs_list": [ 00:16:39.219 { 00:16:39.219 "name": "BaseBdev1", 00:16:39.219 "uuid": "1b630d8c-a4c2-4d02-88bb-13154ca874fe", 00:16:39.219 "is_configured": true, 00:16:39.219 "data_offset": 0, 00:16:39.219 "data_size": 65536 00:16:39.219 }, 00:16:39.219 { 00:16:39.219 "name": "BaseBdev2", 00:16:39.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.219 "is_configured": false, 00:16:39.219 "data_offset": 0, 00:16:39.219 "data_size": 0 00:16:39.219 }, 00:16:39.219 { 00:16:39.219 "name": "BaseBdev3", 00:16:39.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.219 "is_configured": false, 00:16:39.219 "data_offset": 0, 00:16:39.219 "data_size": 0 00:16:39.219 } 00:16:39.219 ] 00:16:39.219 }' 00:16:39.219 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.219 15:52:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:39.787 15:52:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:39.787 [2024-07-12 15:53:00.172689] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:39.787 [2024-07-12 15:53:00.172724] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b20190 name Existed_Raid, state configuring 00:16:39.787 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:40.046 [2024-07-12 15:53:00.361185] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:40.046 [2024-07-12 15:53:00.362277] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:40.046 [2024-07-12 15:53:00.362301] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:40.046 [2024-07-12 15:53:00.362307] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:40.046 [2024-07-12 15:53:00.362313] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.046 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.313 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.313 "name": "Existed_Raid", 00:16:40.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.313 "strip_size_kb": 0, 00:16:40.313 "state": "configuring", 00:16:40.313 "raid_level": "raid1", 00:16:40.313 "superblock": false, 00:16:40.313 "num_base_bdevs": 3, 00:16:40.313 "num_base_bdevs_discovered": 1, 00:16:40.313 "num_base_bdevs_operational": 3, 00:16:40.313 "base_bdevs_list": [ 00:16:40.313 { 00:16:40.313 "name": "BaseBdev1", 00:16:40.313 "uuid": "1b630d8c-a4c2-4d02-88bb-13154ca874fe", 00:16:40.313 "is_configured": true, 00:16:40.313 "data_offset": 0, 00:16:40.313 "data_size": 65536 00:16:40.313 }, 00:16:40.313 { 00:16:40.313 "name": "BaseBdev2", 00:16:40.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.313 "is_configured": false, 00:16:40.313 "data_offset": 0, 00:16:40.313 "data_size": 0 00:16:40.313 }, 00:16:40.313 { 00:16:40.313 "name": "BaseBdev3", 00:16:40.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.313 "is_configured": false, 00:16:40.313 "data_offset": 0, 00:16:40.313 "data_size": 0 00:16:40.313 } 00:16:40.313 ] 00:16:40.313 }' 00:16:40.313 15:53:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.313 15:53:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:40.886 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:40.886 [2024-07-12 15:53:01.284548] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:40.886 BaseBdev2 00:16:40.886 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:40.886 15:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:40.886 15:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:40.886 15:53:01 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:40.886 15:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:40.886 15:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:40.886 15:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:41.160 15:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:41.423 [ 00:16:41.423 { 00:16:41.423 "name": "BaseBdev2", 00:16:41.423 "aliases": [ 00:16:41.423 "402bdb14-e529-448c-b0a7-2552ebddafc5" 00:16:41.423 ], 00:16:41.423 "product_name": "Malloc disk", 00:16:41.423 "block_size": 512, 00:16:41.423 "num_blocks": 65536, 00:16:41.423 "uuid": "402bdb14-e529-448c-b0a7-2552ebddafc5", 00:16:41.423 "assigned_rate_limits": { 00:16:41.423 "rw_ios_per_sec": 0, 00:16:41.423 "rw_mbytes_per_sec": 0, 00:16:41.423 "r_mbytes_per_sec": 0, 00:16:41.423 "w_mbytes_per_sec": 0 00:16:41.423 }, 00:16:41.423 "claimed": true, 00:16:41.423 "claim_type": "exclusive_write", 00:16:41.423 "zoned": false, 00:16:41.423 "supported_io_types": { 00:16:41.423 "read": true, 00:16:41.423 "write": true, 00:16:41.423 "unmap": true, 00:16:41.423 "flush": true, 00:16:41.423 "reset": true, 00:16:41.423 "nvme_admin": false, 00:16:41.423 "nvme_io": false, 00:16:41.423 "nvme_io_md": false, 00:16:41.423 "write_zeroes": true, 00:16:41.423 "zcopy": true, 00:16:41.423 "get_zone_info": false, 00:16:41.423 "zone_management": false, 00:16:41.423 "zone_append": false, 00:16:41.423 "compare": false, 00:16:41.423 "compare_and_write": false, 00:16:41.423 "abort": true, 00:16:41.423 "seek_hole": false, 00:16:41.423 "seek_data": false, 00:16:41.423 "copy": true, 00:16:41.423 "nvme_iov_md": false 00:16:41.423 }, 00:16:41.423 "memory_domains": [ 00:16:41.423 { 00:16:41.423 "dma_device_id": "system", 00:16:41.423 "dma_device_type": 1 00:16:41.423 }, 00:16:41.423 { 00:16:41.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.423 "dma_device_type": 2 00:16:41.423 } 00:16:41.423 ], 00:16:41.423 "driver_specific": {} 00:16:41.423 } 00:16:41.423 ] 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.423 
15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.423 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.748 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.748 "name": "Existed_Raid", 00:16:41.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:41.748 "strip_size_kb": 0, 00:16:41.748 "state": "configuring", 00:16:41.748 "raid_level": "raid1", 00:16:41.748 "superblock": false, 00:16:41.748 "num_base_bdevs": 3, 00:16:41.748 "num_base_bdevs_discovered": 2, 00:16:41.748 "num_base_bdevs_operational": 3, 00:16:41.748 "base_bdevs_list": [ 00:16:41.748 { 00:16:41.748 "name": "BaseBdev1", 00:16:41.748 "uuid": "1b630d8c-a4c2-4d02-88bb-13154ca874fe", 00:16:41.748 "is_configured": true, 00:16:41.748 "data_offset": 0, 00:16:41.748 "data_size": 65536 00:16:41.748 }, 00:16:41.748 { 00:16:41.748 "name": "BaseBdev2", 00:16:41.748 "uuid": "402bdb14-e529-448c-b0a7-2552ebddafc5", 00:16:41.748 "is_configured": true, 00:16:41.748 "data_offset": 0, 00:16:41.748 "data_size": 65536 00:16:41.748 }, 00:16:41.748 { 00:16:41.748 "name": "BaseBdev3", 00:16:41.748 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:41.748 "is_configured": false, 00:16:41.748 "data_offset": 0, 00:16:41.748 "data_size": 0 00:16:41.748 } 00:16:41.748 ] 00:16:41.748 }' 00:16:41.748 15:53:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.748 15:53:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:42.008 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:42.269 [2024-07-12 15:53:02.580692] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:42.269 [2024-07-12 15:53:02.580727] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b21280 00:16:42.269 [2024-07-12 15:53:02.580732] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:42.269 [2024-07-12 15:53:02.580892] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b20d70 00:16:42.269 [2024-07-12 15:53:02.580985] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b21280 00:16:42.269 [2024-07-12 15:53:02.580991] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b21280 00:16:42.269 [2024-07-12 15:53:02.581109] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:42.269 BaseBdev3 00:16:42.269 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:42.269 15:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:42.269 15:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:42.269 15:53:02 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:42.269 15:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:42.269 15:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:42.269 15:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.529 15:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:42.529 [ 00:16:42.529 { 00:16:42.529 "name": "BaseBdev3", 00:16:42.529 "aliases": [ 00:16:42.529 "96a65bd6-a88a-4ea6-8161-2d76428361e4" 00:16:42.529 ], 00:16:42.530 "product_name": "Malloc disk", 00:16:42.530 "block_size": 512, 00:16:42.530 "num_blocks": 65536, 00:16:42.530 "uuid": "96a65bd6-a88a-4ea6-8161-2d76428361e4", 00:16:42.530 "assigned_rate_limits": { 00:16:42.530 "rw_ios_per_sec": 0, 00:16:42.530 "rw_mbytes_per_sec": 0, 00:16:42.530 "r_mbytes_per_sec": 0, 00:16:42.530 "w_mbytes_per_sec": 0 00:16:42.530 }, 00:16:42.530 "claimed": true, 00:16:42.530 "claim_type": "exclusive_write", 00:16:42.530 "zoned": false, 00:16:42.530 "supported_io_types": { 00:16:42.530 "read": true, 00:16:42.530 "write": true, 00:16:42.530 "unmap": true, 00:16:42.530 "flush": true, 00:16:42.530 "reset": true, 00:16:42.530 "nvme_admin": false, 00:16:42.530 "nvme_io": false, 00:16:42.530 "nvme_io_md": false, 00:16:42.530 "write_zeroes": true, 00:16:42.530 "zcopy": true, 00:16:42.530 "get_zone_info": false, 00:16:42.530 "zone_management": false, 00:16:42.530 "zone_append": false, 00:16:42.530 "compare": false, 00:16:42.530 "compare_and_write": false, 00:16:42.530 "abort": true, 00:16:42.530 "seek_hole": false, 00:16:42.530 "seek_data": false, 00:16:42.530 "copy": true, 00:16:42.530 "nvme_iov_md": false 00:16:42.530 }, 00:16:42.530 "memory_domains": [ 00:16:42.530 { 00:16:42.530 "dma_device_id": "system", 00:16:42.530 "dma_device_type": 1 00:16:42.530 }, 00:16:42.530 { 00:16:42.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.530 "dma_device_type": 2 00:16:42.530 } 00:16:42.530 ], 00:16:42.530 "driver_specific": {} 00:16:42.530 } 00:16:42.530 ] 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.530 15:53:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.530 15:53:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.788 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.788 "name": "Existed_Raid", 00:16:42.788 "uuid": "d6b29084-e281-4fd7-819c-519e4588663f", 00:16:42.788 "strip_size_kb": 0, 00:16:42.788 "state": "online", 00:16:42.788 "raid_level": "raid1", 00:16:42.788 "superblock": false, 00:16:42.788 "num_base_bdevs": 3, 00:16:42.788 "num_base_bdevs_discovered": 3, 00:16:42.788 "num_base_bdevs_operational": 3, 00:16:42.788 "base_bdevs_list": [ 00:16:42.788 { 00:16:42.788 "name": "BaseBdev1", 00:16:42.788 "uuid": "1b630d8c-a4c2-4d02-88bb-13154ca874fe", 00:16:42.788 "is_configured": true, 00:16:42.788 "data_offset": 0, 00:16:42.788 "data_size": 65536 00:16:42.788 }, 00:16:42.788 { 00:16:42.788 "name": "BaseBdev2", 00:16:42.788 "uuid": "402bdb14-e529-448c-b0a7-2552ebddafc5", 00:16:42.788 "is_configured": true, 00:16:42.788 "data_offset": 0, 00:16:42.788 "data_size": 65536 00:16:42.788 }, 00:16:42.788 { 00:16:42.788 "name": "BaseBdev3", 00:16:42.788 "uuid": "96a65bd6-a88a-4ea6-8161-2d76428361e4", 00:16:42.788 "is_configured": true, 00:16:42.788 "data_offset": 0, 00:16:42.788 "data_size": 65536 00:16:42.788 } 00:16:42.788 ] 00:16:42.788 }' 00:16:42.788 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.788 15:53:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:43.357 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:43.357 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:43.357 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:43.357 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:43.357 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:43.357 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:43.357 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:43.357 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:43.617 [2024-07-12 15:53:03.888256] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:43.617 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:43.617 "name": "Existed_Raid", 00:16:43.617 "aliases": [ 00:16:43.617 "d6b29084-e281-4fd7-819c-519e4588663f" 00:16:43.617 ], 00:16:43.617 "product_name": "Raid Volume", 00:16:43.617 "block_size": 512, 00:16:43.618 "num_blocks": 65536, 00:16:43.618 "uuid": "d6b29084-e281-4fd7-819c-519e4588663f", 
00:16:43.618 "assigned_rate_limits": { 00:16:43.618 "rw_ios_per_sec": 0, 00:16:43.618 "rw_mbytes_per_sec": 0, 00:16:43.618 "r_mbytes_per_sec": 0, 00:16:43.618 "w_mbytes_per_sec": 0 00:16:43.618 }, 00:16:43.618 "claimed": false, 00:16:43.618 "zoned": false, 00:16:43.618 "supported_io_types": { 00:16:43.618 "read": true, 00:16:43.618 "write": true, 00:16:43.618 "unmap": false, 00:16:43.618 "flush": false, 00:16:43.618 "reset": true, 00:16:43.618 "nvme_admin": false, 00:16:43.618 "nvme_io": false, 00:16:43.618 "nvme_io_md": false, 00:16:43.618 "write_zeroes": true, 00:16:43.618 "zcopy": false, 00:16:43.618 "get_zone_info": false, 00:16:43.618 "zone_management": false, 00:16:43.618 "zone_append": false, 00:16:43.618 "compare": false, 00:16:43.618 "compare_and_write": false, 00:16:43.618 "abort": false, 00:16:43.618 "seek_hole": false, 00:16:43.618 "seek_data": false, 00:16:43.618 "copy": false, 00:16:43.618 "nvme_iov_md": false 00:16:43.618 }, 00:16:43.618 "memory_domains": [ 00:16:43.618 { 00:16:43.618 "dma_device_id": "system", 00:16:43.618 "dma_device_type": 1 00:16:43.618 }, 00:16:43.618 { 00:16:43.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.618 "dma_device_type": 2 00:16:43.618 }, 00:16:43.618 { 00:16:43.618 "dma_device_id": "system", 00:16:43.618 "dma_device_type": 1 00:16:43.618 }, 00:16:43.618 { 00:16:43.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.618 "dma_device_type": 2 00:16:43.618 }, 00:16:43.618 { 00:16:43.618 "dma_device_id": "system", 00:16:43.618 "dma_device_type": 1 00:16:43.618 }, 00:16:43.618 { 00:16:43.618 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.618 "dma_device_type": 2 00:16:43.618 } 00:16:43.618 ], 00:16:43.618 "driver_specific": { 00:16:43.618 "raid": { 00:16:43.618 "uuid": "d6b29084-e281-4fd7-819c-519e4588663f", 00:16:43.618 "strip_size_kb": 0, 00:16:43.618 "state": "online", 00:16:43.618 "raid_level": "raid1", 00:16:43.618 "superblock": false, 00:16:43.618 "num_base_bdevs": 3, 00:16:43.618 "num_base_bdevs_discovered": 3, 00:16:43.618 "num_base_bdevs_operational": 3, 00:16:43.618 "base_bdevs_list": [ 00:16:43.618 { 00:16:43.618 "name": "BaseBdev1", 00:16:43.618 "uuid": "1b630d8c-a4c2-4d02-88bb-13154ca874fe", 00:16:43.618 "is_configured": true, 00:16:43.618 "data_offset": 0, 00:16:43.618 "data_size": 65536 00:16:43.618 }, 00:16:43.618 { 00:16:43.618 "name": "BaseBdev2", 00:16:43.618 "uuid": "402bdb14-e529-448c-b0a7-2552ebddafc5", 00:16:43.618 "is_configured": true, 00:16:43.618 "data_offset": 0, 00:16:43.618 "data_size": 65536 00:16:43.618 }, 00:16:43.618 { 00:16:43.618 "name": "BaseBdev3", 00:16:43.618 "uuid": "96a65bd6-a88a-4ea6-8161-2d76428361e4", 00:16:43.618 "is_configured": true, 00:16:43.618 "data_offset": 0, 00:16:43.618 "data_size": 65536 00:16:43.618 } 00:16:43.618 ] 00:16:43.618 } 00:16:43.618 } 00:16:43.618 }' 00:16:43.618 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:43.618 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:43.618 BaseBdev2 00:16:43.618 BaseBdev3' 00:16:43.618 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:43.618 15:53:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:43.618 15:53:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:43.877 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:43.877 "name": "BaseBdev1", 00:16:43.877 "aliases": [ 00:16:43.877 "1b630d8c-a4c2-4d02-88bb-13154ca874fe" 00:16:43.877 ], 00:16:43.877 "product_name": "Malloc disk", 00:16:43.877 "block_size": 512, 00:16:43.877 "num_blocks": 65536, 00:16:43.877 "uuid": "1b630d8c-a4c2-4d02-88bb-13154ca874fe", 00:16:43.877 "assigned_rate_limits": { 00:16:43.877 "rw_ios_per_sec": 0, 00:16:43.877 "rw_mbytes_per_sec": 0, 00:16:43.877 "r_mbytes_per_sec": 0, 00:16:43.877 "w_mbytes_per_sec": 0 00:16:43.877 }, 00:16:43.877 "claimed": true, 00:16:43.877 "claim_type": "exclusive_write", 00:16:43.877 "zoned": false, 00:16:43.877 "supported_io_types": { 00:16:43.877 "read": true, 00:16:43.877 "write": true, 00:16:43.877 "unmap": true, 00:16:43.877 "flush": true, 00:16:43.877 "reset": true, 00:16:43.877 "nvme_admin": false, 00:16:43.877 "nvme_io": false, 00:16:43.877 "nvme_io_md": false, 00:16:43.877 "write_zeroes": true, 00:16:43.877 "zcopy": true, 00:16:43.877 "get_zone_info": false, 00:16:43.877 "zone_management": false, 00:16:43.877 "zone_append": false, 00:16:43.877 "compare": false, 00:16:43.877 "compare_and_write": false, 00:16:43.877 "abort": true, 00:16:43.877 "seek_hole": false, 00:16:43.878 "seek_data": false, 00:16:43.878 "copy": true, 00:16:43.878 "nvme_iov_md": false 00:16:43.878 }, 00:16:43.878 "memory_domains": [ 00:16:43.878 { 00:16:43.878 "dma_device_id": "system", 00:16:43.878 "dma_device_type": 1 00:16:43.878 }, 00:16:43.878 { 00:16:43.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:43.878 "dma_device_type": 2 00:16:43.878 } 00:16:43.878 ], 00:16:43.878 "driver_specific": {} 00:16:43.878 }' 00:16:43.878 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:43.878 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:43.878 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:43.878 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:43.878 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:43.878 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:43.878 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:44.136 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:44.137 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:44.137 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:44.137 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:44.137 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:44.137 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:44.137 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:44.137 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:44.396 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:44.396 "name": "BaseBdev2", 
00:16:44.396 "aliases": [ 00:16:44.396 "402bdb14-e529-448c-b0a7-2552ebddafc5" 00:16:44.396 ], 00:16:44.396 "product_name": "Malloc disk", 00:16:44.396 "block_size": 512, 00:16:44.396 "num_blocks": 65536, 00:16:44.396 "uuid": "402bdb14-e529-448c-b0a7-2552ebddafc5", 00:16:44.396 "assigned_rate_limits": { 00:16:44.396 "rw_ios_per_sec": 0, 00:16:44.396 "rw_mbytes_per_sec": 0, 00:16:44.396 "r_mbytes_per_sec": 0, 00:16:44.396 "w_mbytes_per_sec": 0 00:16:44.396 }, 00:16:44.396 "claimed": true, 00:16:44.396 "claim_type": "exclusive_write", 00:16:44.396 "zoned": false, 00:16:44.396 "supported_io_types": { 00:16:44.396 "read": true, 00:16:44.396 "write": true, 00:16:44.396 "unmap": true, 00:16:44.396 "flush": true, 00:16:44.396 "reset": true, 00:16:44.396 "nvme_admin": false, 00:16:44.396 "nvme_io": false, 00:16:44.396 "nvme_io_md": false, 00:16:44.396 "write_zeroes": true, 00:16:44.396 "zcopy": true, 00:16:44.396 "get_zone_info": false, 00:16:44.396 "zone_management": false, 00:16:44.396 "zone_append": false, 00:16:44.396 "compare": false, 00:16:44.396 "compare_and_write": false, 00:16:44.396 "abort": true, 00:16:44.396 "seek_hole": false, 00:16:44.396 "seek_data": false, 00:16:44.396 "copy": true, 00:16:44.396 "nvme_iov_md": false 00:16:44.396 }, 00:16:44.396 "memory_domains": [ 00:16:44.396 { 00:16:44.396 "dma_device_id": "system", 00:16:44.396 "dma_device_type": 1 00:16:44.396 }, 00:16:44.396 { 00:16:44.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.396 "dma_device_type": 2 00:16:44.396 } 00:16:44.396 ], 00:16:44.396 "driver_specific": {} 00:16:44.396 }' 00:16:44.396 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.396 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.396 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:44.396 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:44.396 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:44.396 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:44.396 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:44.655 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:44.656 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:44.656 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:44.656 15:53:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:44.656 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:44.656 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:44.656 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:44.656 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:44.916 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:44.916 "name": "BaseBdev3", 00:16:44.916 "aliases": [ 00:16:44.916 "96a65bd6-a88a-4ea6-8161-2d76428361e4" 00:16:44.916 ], 00:16:44.916 "product_name": "Malloc disk", 00:16:44.916 "block_size": 512, 
00:16:44.916 "num_blocks": 65536, 00:16:44.916 "uuid": "96a65bd6-a88a-4ea6-8161-2d76428361e4", 00:16:44.916 "assigned_rate_limits": { 00:16:44.916 "rw_ios_per_sec": 0, 00:16:44.916 "rw_mbytes_per_sec": 0, 00:16:44.916 "r_mbytes_per_sec": 0, 00:16:44.916 "w_mbytes_per_sec": 0 00:16:44.916 }, 00:16:44.916 "claimed": true, 00:16:44.916 "claim_type": "exclusive_write", 00:16:44.916 "zoned": false, 00:16:44.916 "supported_io_types": { 00:16:44.916 "read": true, 00:16:44.916 "write": true, 00:16:44.916 "unmap": true, 00:16:44.916 "flush": true, 00:16:44.916 "reset": true, 00:16:44.916 "nvme_admin": false, 00:16:44.916 "nvme_io": false, 00:16:44.916 "nvme_io_md": false, 00:16:44.916 "write_zeroes": true, 00:16:44.916 "zcopy": true, 00:16:44.916 "get_zone_info": false, 00:16:44.916 "zone_management": false, 00:16:44.916 "zone_append": false, 00:16:44.916 "compare": false, 00:16:44.916 "compare_and_write": false, 00:16:44.916 "abort": true, 00:16:44.916 "seek_hole": false, 00:16:44.916 "seek_data": false, 00:16:44.916 "copy": true, 00:16:44.916 "nvme_iov_md": false 00:16:44.916 }, 00:16:44.916 "memory_domains": [ 00:16:44.916 { 00:16:44.916 "dma_device_id": "system", 00:16:44.916 "dma_device_type": 1 00:16:44.916 }, 00:16:44.916 { 00:16:44.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.916 "dma_device_type": 2 00:16:44.916 } 00:16:44.916 ], 00:16:44.916 "driver_specific": {} 00:16:44.916 }' 00:16:44.916 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.916 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:44.916 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:44.916 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:44.916 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:45.176 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:45.176 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.176 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:45.176 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:45.176 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.176 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:45.176 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:45.176 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:45.435 [2024-07-12 15:53:05.700644] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.435 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.695 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.695 "name": "Existed_Raid", 00:16:45.695 "uuid": "d6b29084-e281-4fd7-819c-519e4588663f", 00:16:45.695 "strip_size_kb": 0, 00:16:45.695 "state": "online", 00:16:45.695 "raid_level": "raid1", 00:16:45.695 "superblock": false, 00:16:45.695 "num_base_bdevs": 3, 00:16:45.695 "num_base_bdevs_discovered": 2, 00:16:45.695 "num_base_bdevs_operational": 2, 00:16:45.695 "base_bdevs_list": [ 00:16:45.695 { 00:16:45.695 "name": null, 00:16:45.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.695 "is_configured": false, 00:16:45.695 "data_offset": 0, 00:16:45.695 "data_size": 65536 00:16:45.695 }, 00:16:45.695 { 00:16:45.695 "name": "BaseBdev2", 00:16:45.695 "uuid": "402bdb14-e529-448c-b0a7-2552ebddafc5", 00:16:45.695 "is_configured": true, 00:16:45.695 "data_offset": 0, 00:16:45.695 "data_size": 65536 00:16:45.695 }, 00:16:45.695 { 00:16:45.695 "name": "BaseBdev3", 00:16:45.695 "uuid": "96a65bd6-a88a-4ea6-8161-2d76428361e4", 00:16:45.695 "is_configured": true, 00:16:45.695 "data_offset": 0, 00:16:45.695 "data_size": 65536 00:16:45.695 } 00:16:45.695 ] 00:16:45.695 }' 00:16:45.695 15:53:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.695 15:53:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.264 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:46.264 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:46.264 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.264 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:46.264 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:46.264 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid 
'!=' Existed_Raid ']' 00:16:46.264 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:46.523 [2024-07-12 15:53:06.815474] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:46.523 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:46.523 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:46.523 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.523 15:53:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:46.783 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:46.783 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:46.783 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:46.783 [2024-07-12 15:53:07.202373] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:46.784 [2024-07-12 15:53:07.202433] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:46.784 [2024-07-12 15:53:07.208499] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:46.784 [2024-07-12 15:53:07.208524] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:46.784 [2024-07-12 15:53:07.208530] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b21280 name Existed_Raid, state offline 00:16:46.784 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:46.784 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:46.784 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.784 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:47.043 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:47.043 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:47.043 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:47.044 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:47.044 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:47.044 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:47.303 BaseBdev2 00:16:47.303 15:53:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:47.303 15:53:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:47.303 15:53:07 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:47.303 15:53:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:47.303 15:53:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:47.303 15:53:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:47.303 15:53:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:47.562 15:53:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:47.562 [ 00:16:47.562 { 00:16:47.562 "name": "BaseBdev2", 00:16:47.562 "aliases": [ 00:16:47.562 "9c26ff4a-509b-4ef9-9f63-a622885ab7c7" 00:16:47.562 ], 00:16:47.562 "product_name": "Malloc disk", 00:16:47.562 "block_size": 512, 00:16:47.562 "num_blocks": 65536, 00:16:47.562 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:47.562 "assigned_rate_limits": { 00:16:47.562 "rw_ios_per_sec": 0, 00:16:47.562 "rw_mbytes_per_sec": 0, 00:16:47.562 "r_mbytes_per_sec": 0, 00:16:47.562 "w_mbytes_per_sec": 0 00:16:47.562 }, 00:16:47.562 "claimed": false, 00:16:47.562 "zoned": false, 00:16:47.562 "supported_io_types": { 00:16:47.562 "read": true, 00:16:47.562 "write": true, 00:16:47.562 "unmap": true, 00:16:47.562 "flush": true, 00:16:47.562 "reset": true, 00:16:47.562 "nvme_admin": false, 00:16:47.562 "nvme_io": false, 00:16:47.562 "nvme_io_md": false, 00:16:47.562 "write_zeroes": true, 00:16:47.562 "zcopy": true, 00:16:47.562 "get_zone_info": false, 00:16:47.562 "zone_management": false, 00:16:47.562 "zone_append": false, 00:16:47.562 "compare": false, 00:16:47.562 "compare_and_write": false, 00:16:47.562 "abort": true, 00:16:47.562 "seek_hole": false, 00:16:47.562 "seek_data": false, 00:16:47.562 "copy": true, 00:16:47.562 "nvme_iov_md": false 00:16:47.562 }, 00:16:47.562 "memory_domains": [ 00:16:47.562 { 00:16:47.562 "dma_device_id": "system", 00:16:47.562 "dma_device_type": 1 00:16:47.562 }, 00:16:47.562 { 00:16:47.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.562 "dma_device_type": 2 00:16:47.562 } 00:16:47.562 ], 00:16:47.562 "driver_specific": {} 00:16:47.562 } 00:16:47.562 ] 00:16:47.562 15:53:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:47.562 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:47.562 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:47.562 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:47.822 BaseBdev3 00:16:47.822 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:47.822 15:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:47.822 15:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:47.822 15:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:47.822 15:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:47.822 15:53:08 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:47.822 15:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:48.081 15:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:48.340 [ 00:16:48.340 { 00:16:48.340 "name": "BaseBdev3", 00:16:48.340 "aliases": [ 00:16:48.340 "d9dcfbc4-0e83-493c-b6ab-2151408e6021" 00:16:48.340 ], 00:16:48.340 "product_name": "Malloc disk", 00:16:48.340 "block_size": 512, 00:16:48.340 "num_blocks": 65536, 00:16:48.340 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:48.340 "assigned_rate_limits": { 00:16:48.340 "rw_ios_per_sec": 0, 00:16:48.340 "rw_mbytes_per_sec": 0, 00:16:48.340 "r_mbytes_per_sec": 0, 00:16:48.340 "w_mbytes_per_sec": 0 00:16:48.340 }, 00:16:48.340 "claimed": false, 00:16:48.340 "zoned": false, 00:16:48.340 "supported_io_types": { 00:16:48.340 "read": true, 00:16:48.340 "write": true, 00:16:48.340 "unmap": true, 00:16:48.340 "flush": true, 00:16:48.340 "reset": true, 00:16:48.341 "nvme_admin": false, 00:16:48.341 "nvme_io": false, 00:16:48.341 "nvme_io_md": false, 00:16:48.341 "write_zeroes": true, 00:16:48.341 "zcopy": true, 00:16:48.341 "get_zone_info": false, 00:16:48.341 "zone_management": false, 00:16:48.341 "zone_append": false, 00:16:48.341 "compare": false, 00:16:48.341 "compare_and_write": false, 00:16:48.341 "abort": true, 00:16:48.341 "seek_hole": false, 00:16:48.341 "seek_data": false, 00:16:48.341 "copy": true, 00:16:48.341 "nvme_iov_md": false 00:16:48.341 }, 00:16:48.341 "memory_domains": [ 00:16:48.341 { 00:16:48.341 "dma_device_id": "system", 00:16:48.341 "dma_device_type": 1 00:16:48.341 }, 00:16:48.341 { 00:16:48.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.341 "dma_device_type": 2 00:16:48.341 } 00:16:48.341 ], 00:16:48.341 "driver_specific": {} 00:16:48.341 } 00:16:48.341 ] 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:48.341 [2024-07-12 15:53:08.754277] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:48.341 [2024-07-12 15:53:08.754306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:48.341 [2024-07-12 15:53:08.754320] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:48.341 [2024-07-12 15:53:08.755467] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.341 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.600 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.600 "name": "Existed_Raid", 00:16:48.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.600 "strip_size_kb": 0, 00:16:48.600 "state": "configuring", 00:16:48.600 "raid_level": "raid1", 00:16:48.600 "superblock": false, 00:16:48.600 "num_base_bdevs": 3, 00:16:48.600 "num_base_bdevs_discovered": 2, 00:16:48.600 "num_base_bdevs_operational": 3, 00:16:48.600 "base_bdevs_list": [ 00:16:48.600 { 00:16:48.600 "name": "BaseBdev1", 00:16:48.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.600 "is_configured": false, 00:16:48.600 "data_offset": 0, 00:16:48.600 "data_size": 0 00:16:48.600 }, 00:16:48.600 { 00:16:48.600 "name": "BaseBdev2", 00:16:48.600 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:48.600 "is_configured": true, 00:16:48.600 "data_offset": 0, 00:16:48.600 "data_size": 65536 00:16:48.600 }, 00:16:48.600 { 00:16:48.600 "name": "BaseBdev3", 00:16:48.600 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:48.600 "is_configured": true, 00:16:48.600 "data_offset": 0, 00:16:48.600 "data_size": 65536 00:16:48.600 } 00:16:48.600 ] 00:16:48.600 }' 00:16:48.600 15:53:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.600 15:53:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:49.168 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:49.429 [2024-07-12 15:53:09.652532] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.429 "name": "Existed_Raid", 00:16:49.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.429 "strip_size_kb": 0, 00:16:49.429 "state": "configuring", 00:16:49.429 "raid_level": "raid1", 00:16:49.429 "superblock": false, 00:16:49.429 "num_base_bdevs": 3, 00:16:49.429 "num_base_bdevs_discovered": 1, 00:16:49.429 "num_base_bdevs_operational": 3, 00:16:49.429 "base_bdevs_list": [ 00:16:49.429 { 00:16:49.429 "name": "BaseBdev1", 00:16:49.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.429 "is_configured": false, 00:16:49.429 "data_offset": 0, 00:16:49.429 "data_size": 0 00:16:49.429 }, 00:16:49.429 { 00:16:49.429 "name": null, 00:16:49.429 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:49.429 "is_configured": false, 00:16:49.429 "data_offset": 0, 00:16:49.429 "data_size": 65536 00:16:49.429 }, 00:16:49.429 { 00:16:49.429 "name": "BaseBdev3", 00:16:49.429 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:49.429 "is_configured": true, 00:16:49.429 "data_offset": 0, 00:16:49.429 "data_size": 65536 00:16:49.429 } 00:16:49.429 ] 00:16:49.429 }' 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.429 15:53:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:50.367 15:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.367 15:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:50.627 15:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:50.627 15:53:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:50.885 [2024-07-12 15:53:11.177332] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:50.886 BaseBdev1 00:16:50.886 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:50.886 15:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:50.886 15:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:50.886 15:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:50.886 15:53:11 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:50.886 15:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:50.886 15:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:51.145 [ 00:16:51.145 { 00:16:51.145 "name": "BaseBdev1", 00:16:51.145 "aliases": [ 00:16:51.145 "9d81c94d-5f6c-4c67-b842-3d20e977f683" 00:16:51.145 ], 00:16:51.145 "product_name": "Malloc disk", 00:16:51.145 "block_size": 512, 00:16:51.145 "num_blocks": 65536, 00:16:51.145 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:51.145 "assigned_rate_limits": { 00:16:51.145 "rw_ios_per_sec": 0, 00:16:51.145 "rw_mbytes_per_sec": 0, 00:16:51.145 "r_mbytes_per_sec": 0, 00:16:51.145 "w_mbytes_per_sec": 0 00:16:51.145 }, 00:16:51.145 "claimed": true, 00:16:51.145 "claim_type": "exclusive_write", 00:16:51.145 "zoned": false, 00:16:51.145 "supported_io_types": { 00:16:51.145 "read": true, 00:16:51.145 "write": true, 00:16:51.145 "unmap": true, 00:16:51.145 "flush": true, 00:16:51.145 "reset": true, 00:16:51.145 "nvme_admin": false, 00:16:51.145 "nvme_io": false, 00:16:51.145 "nvme_io_md": false, 00:16:51.145 "write_zeroes": true, 00:16:51.145 "zcopy": true, 00:16:51.145 "get_zone_info": false, 00:16:51.145 "zone_management": false, 00:16:51.145 "zone_append": false, 00:16:51.145 "compare": false, 00:16:51.145 "compare_and_write": false, 00:16:51.145 "abort": true, 00:16:51.145 "seek_hole": false, 00:16:51.145 "seek_data": false, 00:16:51.145 "copy": true, 00:16:51.145 "nvme_iov_md": false 00:16:51.145 }, 00:16:51.145 "memory_domains": [ 00:16:51.145 { 00:16:51.145 "dma_device_id": "system", 00:16:51.145 "dma_device_type": 1 00:16:51.145 }, 00:16:51.145 { 00:16:51.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.145 "dma_device_type": 2 00:16:51.145 } 00:16:51.145 ], 00:16:51.145 "driver_specific": {} 00:16:51.145 } 00:16:51.145 ] 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:51.145 15:53:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.145 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:51.405 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:51.405 "name": "Existed_Raid", 00:16:51.405 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:51.405 "strip_size_kb": 0, 00:16:51.405 "state": "configuring", 00:16:51.405 "raid_level": "raid1", 00:16:51.405 "superblock": false, 00:16:51.405 "num_base_bdevs": 3, 00:16:51.405 "num_base_bdevs_discovered": 2, 00:16:51.405 "num_base_bdevs_operational": 3, 00:16:51.405 "base_bdevs_list": [ 00:16:51.405 { 00:16:51.405 "name": "BaseBdev1", 00:16:51.405 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:51.405 "is_configured": true, 00:16:51.405 "data_offset": 0, 00:16:51.405 "data_size": 65536 00:16:51.405 }, 00:16:51.405 { 00:16:51.405 "name": null, 00:16:51.405 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:51.405 "is_configured": false, 00:16:51.405 "data_offset": 0, 00:16:51.405 "data_size": 65536 00:16:51.405 }, 00:16:51.405 { 00:16:51.405 "name": "BaseBdev3", 00:16:51.405 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:51.405 "is_configured": true, 00:16:51.405 "data_offset": 0, 00:16:51.405 "data_size": 65536 00:16:51.405 } 00:16:51.405 ] 00:16:51.405 }' 00:16:51.405 15:53:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:51.405 15:53:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:51.974 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.974 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:52.233 [2024-07-12 15:53:12.657110] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:52.233 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:52.234 
15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:52.234 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:52.234 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:52.493 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:52.493 "name": "Existed_Raid", 00:16:52.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:52.493 "strip_size_kb": 0, 00:16:52.493 "state": "configuring", 00:16:52.493 "raid_level": "raid1", 00:16:52.493 "superblock": false, 00:16:52.493 "num_base_bdevs": 3, 00:16:52.493 "num_base_bdevs_discovered": 1, 00:16:52.493 "num_base_bdevs_operational": 3, 00:16:52.493 "base_bdevs_list": [ 00:16:52.493 { 00:16:52.493 "name": "BaseBdev1", 00:16:52.493 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:52.493 "is_configured": true, 00:16:52.493 "data_offset": 0, 00:16:52.493 "data_size": 65536 00:16:52.493 }, 00:16:52.493 { 00:16:52.493 "name": null, 00:16:52.493 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:52.493 "is_configured": false, 00:16:52.493 "data_offset": 0, 00:16:52.493 "data_size": 65536 00:16:52.493 }, 00:16:52.493 { 00:16:52.493 "name": null, 00:16:52.493 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:52.493 "is_configured": false, 00:16:52.493 "data_offset": 0, 00:16:52.493 "data_size": 65536 00:16:52.493 } 00:16:52.493 ] 00:16:52.493 }' 00:16:52.493 15:53:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:52.493 15:53:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:53.063 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.063 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:53.322 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:53.322 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:53.582 [2024-07-12 15:53:13.787999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:53.582 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.582 "name": "Existed_Raid", 00:16:53.582 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:53.582 "strip_size_kb": 0, 00:16:53.582 "state": "configuring", 00:16:53.582 "raid_level": "raid1", 00:16:53.582 "superblock": false, 00:16:53.582 "num_base_bdevs": 3, 00:16:53.582 "num_base_bdevs_discovered": 2, 00:16:53.582 "num_base_bdevs_operational": 3, 00:16:53.582 "base_bdevs_list": [ 00:16:53.583 { 00:16:53.583 "name": "BaseBdev1", 00:16:53.583 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:53.583 "is_configured": true, 00:16:53.583 "data_offset": 0, 00:16:53.583 "data_size": 65536 00:16:53.583 }, 00:16:53.583 { 00:16:53.583 "name": null, 00:16:53.583 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:53.583 "is_configured": false, 00:16:53.583 "data_offset": 0, 00:16:53.583 "data_size": 65536 00:16:53.583 }, 00:16:53.583 { 00:16:53.583 "name": "BaseBdev3", 00:16:53.583 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:53.583 "is_configured": true, 00:16:53.583 "data_offset": 0, 00:16:53.583 "data_size": 65536 00:16:53.583 } 00:16:53.583 ] 00:16:53.583 }' 00:16:53.583 15:53:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.583 15:53:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:54.153 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.153 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:54.413 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:54.413 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:54.672 [2024-07-12 15:53:14.914872] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:54.672 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:54.672 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:54.672 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:54.672 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:54.673 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:54.673 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:54.673 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:54.673 
15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:54.673 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:54.673 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:54.673 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.673 15:53:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:54.932 15:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:54.932 "name": "Existed_Raid", 00:16:54.932 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.932 "strip_size_kb": 0, 00:16:54.932 "state": "configuring", 00:16:54.932 "raid_level": "raid1", 00:16:54.932 "superblock": false, 00:16:54.932 "num_base_bdevs": 3, 00:16:54.932 "num_base_bdevs_discovered": 1, 00:16:54.932 "num_base_bdevs_operational": 3, 00:16:54.932 "base_bdevs_list": [ 00:16:54.932 { 00:16:54.932 "name": null, 00:16:54.932 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:54.932 "is_configured": false, 00:16:54.932 "data_offset": 0, 00:16:54.932 "data_size": 65536 00:16:54.932 }, 00:16:54.932 { 00:16:54.932 "name": null, 00:16:54.932 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:54.932 "is_configured": false, 00:16:54.932 "data_offset": 0, 00:16:54.932 "data_size": 65536 00:16:54.932 }, 00:16:54.932 { 00:16:54.932 "name": "BaseBdev3", 00:16:54.932 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:54.932 "is_configured": true, 00:16:54.932 "data_offset": 0, 00:16:54.932 "data_size": 65536 00:16:54.932 } 00:16:54.932 ] 00:16:54.932 }' 00:16:54.932 15:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:54.932 15:53:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:55.499 15:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.499 15:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:55.499 15:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:55.499 15:53:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:55.758 [2024-07-12 15:53:16.011510] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:55.758 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:55.759 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.759 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:56.017 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:56.017 "name": "Existed_Raid", 00:16:56.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:56.017 "strip_size_kb": 0, 00:16:56.017 "state": "configuring", 00:16:56.017 "raid_level": "raid1", 00:16:56.017 "superblock": false, 00:16:56.017 "num_base_bdevs": 3, 00:16:56.017 "num_base_bdevs_discovered": 2, 00:16:56.017 "num_base_bdevs_operational": 3, 00:16:56.017 "base_bdevs_list": [ 00:16:56.017 { 00:16:56.017 "name": null, 00:16:56.017 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:56.017 "is_configured": false, 00:16:56.017 "data_offset": 0, 00:16:56.017 "data_size": 65536 00:16:56.017 }, 00:16:56.017 { 00:16:56.017 "name": "BaseBdev2", 00:16:56.017 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:56.017 "is_configured": true, 00:16:56.017 "data_offset": 0, 00:16:56.017 "data_size": 65536 00:16:56.017 }, 00:16:56.017 { 00:16:56.017 "name": "BaseBdev3", 00:16:56.017 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:56.017 "is_configured": true, 00:16:56.017 "data_offset": 0, 00:16:56.017 "data_size": 65536 00:16:56.017 } 00:16:56.017 ] 00:16:56.017 }' 00:16:56.017 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:56.017 15:53:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:56.584 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.584 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:56.584 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:56.584 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.584 15:53:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:56.842 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9d81c94d-5f6c-4c67-b842-3d20e977f683 00:16:57.101 [2024-07-12 15:53:17.355859] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:57.101 [2024-07-12 15:53:17.355892] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b20fa0 00:16:57.101 [2024-07-12 15:53:17.355897] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:57.101 [2024-07-12 15:53:17.356044] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b27b00 00:16:57.101 [2024-07-12 15:53:17.356137] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b20fa0 00:16:57.101 [2024-07-12 15:53:17.356142] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b20fa0 00:16:57.101 [2024-07-12 15:53:17.356261] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:57.101 NewBaseBdev 00:16:57.101 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:57.101 15:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:57.101 15:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:57.101 15:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:57.101 15:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:57.101 15:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:57.101 15:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:57.361 [ 00:16:57.361 { 00:16:57.361 "name": "NewBaseBdev", 00:16:57.361 "aliases": [ 00:16:57.361 "9d81c94d-5f6c-4c67-b842-3d20e977f683" 00:16:57.361 ], 00:16:57.361 "product_name": "Malloc disk", 00:16:57.361 "block_size": 512, 00:16:57.361 "num_blocks": 65536, 00:16:57.361 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:57.361 "assigned_rate_limits": { 00:16:57.361 "rw_ios_per_sec": 0, 00:16:57.361 "rw_mbytes_per_sec": 0, 00:16:57.361 "r_mbytes_per_sec": 0, 00:16:57.361 "w_mbytes_per_sec": 0 00:16:57.361 }, 00:16:57.361 "claimed": true, 00:16:57.361 "claim_type": "exclusive_write", 00:16:57.361 "zoned": false, 00:16:57.361 "supported_io_types": { 00:16:57.361 "read": true, 00:16:57.361 "write": true, 00:16:57.361 "unmap": true, 00:16:57.361 "flush": true, 00:16:57.361 "reset": true, 00:16:57.361 "nvme_admin": false, 00:16:57.361 "nvme_io": false, 00:16:57.361 "nvme_io_md": false, 00:16:57.361 "write_zeroes": true, 00:16:57.361 "zcopy": true, 00:16:57.361 "get_zone_info": false, 00:16:57.361 "zone_management": false, 00:16:57.361 "zone_append": false, 00:16:57.361 "compare": false, 00:16:57.361 "compare_and_write": false, 00:16:57.361 "abort": true, 00:16:57.361 "seek_hole": false, 00:16:57.361 "seek_data": false, 00:16:57.361 "copy": true, 00:16:57.361 "nvme_iov_md": false 00:16:57.361 }, 00:16:57.361 "memory_domains": [ 00:16:57.361 { 00:16:57.361 "dma_device_id": "system", 00:16:57.361 "dma_device_type": 1 00:16:57.361 }, 00:16:57.361 { 00:16:57.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:57.361 "dma_device_type": 2 00:16:57.361 } 00:16:57.361 ], 00:16:57.361 "driver_specific": {} 00:16:57.361 } 00:16:57.361 ] 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.361 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:57.621 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:57.621 "name": "Existed_Raid", 00:16:57.621 "uuid": "71b6f017-6e10-4469-bd5c-d096ddb6e360", 00:16:57.621 "strip_size_kb": 0, 00:16:57.621 "state": "online", 00:16:57.621 "raid_level": "raid1", 00:16:57.621 "superblock": false, 00:16:57.621 "num_base_bdevs": 3, 00:16:57.621 "num_base_bdevs_discovered": 3, 00:16:57.621 "num_base_bdevs_operational": 3, 00:16:57.621 "base_bdevs_list": [ 00:16:57.621 { 00:16:57.621 "name": "NewBaseBdev", 00:16:57.621 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:57.621 "is_configured": true, 00:16:57.622 "data_offset": 0, 00:16:57.622 "data_size": 65536 00:16:57.622 }, 00:16:57.622 { 00:16:57.622 "name": "BaseBdev2", 00:16:57.622 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:57.622 "is_configured": true, 00:16:57.622 "data_offset": 0, 00:16:57.622 "data_size": 65536 00:16:57.622 }, 00:16:57.622 { 00:16:57.622 "name": "BaseBdev3", 00:16:57.622 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:57.622 "is_configured": true, 00:16:57.622 "data_offset": 0, 00:16:57.622 "data_size": 65536 00:16:57.622 } 00:16:57.622 ] 00:16:57.622 }' 00:16:57.622 15:53:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:57.622 15:53:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:58.230 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:58.230 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:58.230 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:58.230 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:58.230 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:58.230 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:58.230 15:53:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:58.230 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:58.491 [2024-07-12 15:53:18.667592] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:58.491 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:58.491 "name": "Existed_Raid", 00:16:58.491 "aliases": [ 00:16:58.491 "71b6f017-6e10-4469-bd5c-d096ddb6e360" 00:16:58.491 ], 00:16:58.491 "product_name": "Raid Volume", 00:16:58.491 "block_size": 512, 00:16:58.491 "num_blocks": 65536, 00:16:58.491 "uuid": "71b6f017-6e10-4469-bd5c-d096ddb6e360", 00:16:58.491 "assigned_rate_limits": { 00:16:58.491 "rw_ios_per_sec": 0, 00:16:58.491 "rw_mbytes_per_sec": 0, 00:16:58.491 "r_mbytes_per_sec": 0, 00:16:58.491 "w_mbytes_per_sec": 0 00:16:58.491 }, 00:16:58.491 "claimed": false, 00:16:58.491 "zoned": false, 00:16:58.491 "supported_io_types": { 00:16:58.491 "read": true, 00:16:58.491 "write": true, 00:16:58.491 "unmap": false, 00:16:58.491 "flush": false, 00:16:58.491 "reset": true, 00:16:58.491 "nvme_admin": false, 00:16:58.491 "nvme_io": false, 00:16:58.491 "nvme_io_md": false, 00:16:58.491 "write_zeroes": true, 00:16:58.491 "zcopy": false, 00:16:58.491 "get_zone_info": false, 00:16:58.491 "zone_management": false, 00:16:58.491 "zone_append": false, 00:16:58.491 "compare": false, 00:16:58.491 "compare_and_write": false, 00:16:58.491 "abort": false, 00:16:58.491 "seek_hole": false, 00:16:58.491 "seek_data": false, 00:16:58.491 "copy": false, 00:16:58.491 "nvme_iov_md": false 00:16:58.491 }, 00:16:58.491 "memory_domains": [ 00:16:58.491 { 00:16:58.491 "dma_device_id": "system", 00:16:58.491 "dma_device_type": 1 00:16:58.491 }, 00:16:58.491 { 00:16:58.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.492 "dma_device_type": 2 00:16:58.492 }, 00:16:58.492 { 00:16:58.492 "dma_device_id": "system", 00:16:58.492 "dma_device_type": 1 00:16:58.492 }, 00:16:58.492 { 00:16:58.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.492 "dma_device_type": 2 00:16:58.492 }, 00:16:58.492 { 00:16:58.492 "dma_device_id": "system", 00:16:58.492 "dma_device_type": 1 00:16:58.492 }, 00:16:58.492 { 00:16:58.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.492 "dma_device_type": 2 00:16:58.492 } 00:16:58.492 ], 00:16:58.492 "driver_specific": { 00:16:58.492 "raid": { 00:16:58.492 "uuid": "71b6f017-6e10-4469-bd5c-d096ddb6e360", 00:16:58.492 "strip_size_kb": 0, 00:16:58.492 "state": "online", 00:16:58.492 "raid_level": "raid1", 00:16:58.492 "superblock": false, 00:16:58.492 "num_base_bdevs": 3, 00:16:58.492 "num_base_bdevs_discovered": 3, 00:16:58.492 "num_base_bdevs_operational": 3, 00:16:58.492 "base_bdevs_list": [ 00:16:58.492 { 00:16:58.492 "name": "NewBaseBdev", 00:16:58.492 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:58.492 "is_configured": true, 00:16:58.492 "data_offset": 0, 00:16:58.492 "data_size": 65536 00:16:58.492 }, 00:16:58.492 { 00:16:58.492 "name": "BaseBdev2", 00:16:58.492 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:58.492 "is_configured": true, 00:16:58.492 "data_offset": 0, 00:16:58.492 "data_size": 65536 00:16:58.492 }, 00:16:58.492 { 00:16:58.492 "name": "BaseBdev3", 00:16:58.492 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:58.492 "is_configured": true, 00:16:58.492 "data_offset": 0, 00:16:58.492 "data_size": 
65536 00:16:58.492 } 00:16:58.492 ] 00:16:58.492 } 00:16:58.492 } 00:16:58.492 }' 00:16:58.492 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:58.492 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:58.492 BaseBdev2 00:16:58.492 BaseBdev3' 00:16:58.492 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:58.492 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:58.492 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:58.492 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:58.492 "name": "NewBaseBdev", 00:16:58.492 "aliases": [ 00:16:58.492 "9d81c94d-5f6c-4c67-b842-3d20e977f683" 00:16:58.492 ], 00:16:58.492 "product_name": "Malloc disk", 00:16:58.492 "block_size": 512, 00:16:58.492 "num_blocks": 65536, 00:16:58.492 "uuid": "9d81c94d-5f6c-4c67-b842-3d20e977f683", 00:16:58.492 "assigned_rate_limits": { 00:16:58.492 "rw_ios_per_sec": 0, 00:16:58.492 "rw_mbytes_per_sec": 0, 00:16:58.492 "r_mbytes_per_sec": 0, 00:16:58.492 "w_mbytes_per_sec": 0 00:16:58.492 }, 00:16:58.492 "claimed": true, 00:16:58.492 "claim_type": "exclusive_write", 00:16:58.492 "zoned": false, 00:16:58.492 "supported_io_types": { 00:16:58.492 "read": true, 00:16:58.492 "write": true, 00:16:58.492 "unmap": true, 00:16:58.492 "flush": true, 00:16:58.492 "reset": true, 00:16:58.492 "nvme_admin": false, 00:16:58.492 "nvme_io": false, 00:16:58.492 "nvme_io_md": false, 00:16:58.492 "write_zeroes": true, 00:16:58.492 "zcopy": true, 00:16:58.492 "get_zone_info": false, 00:16:58.492 "zone_management": false, 00:16:58.492 "zone_append": false, 00:16:58.492 "compare": false, 00:16:58.492 "compare_and_write": false, 00:16:58.492 "abort": true, 00:16:58.492 "seek_hole": false, 00:16:58.492 "seek_data": false, 00:16:58.492 "copy": true, 00:16:58.492 "nvme_iov_md": false 00:16:58.492 }, 00:16:58.492 "memory_domains": [ 00:16:58.492 { 00:16:58.492 "dma_device_id": "system", 00:16:58.492 "dma_device_type": 1 00:16:58.492 }, 00:16:58.492 { 00:16:58.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:58.492 "dma_device_type": 2 00:16:58.492 } 00:16:58.492 ], 00:16:58.492 "driver_specific": {} 00:16:58.492 }' 00:16:58.492 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.751 15:53:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:58.751 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:58.751 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.751 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:58.751 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:58.751 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:58.751 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.010 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.010 15:53:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.010 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.010 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:59.010 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:59.010 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:59.010 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:59.270 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:59.270 "name": "BaseBdev2", 00:16:59.270 "aliases": [ 00:16:59.270 "9c26ff4a-509b-4ef9-9f63-a622885ab7c7" 00:16:59.270 ], 00:16:59.270 "product_name": "Malloc disk", 00:16:59.270 "block_size": 512, 00:16:59.270 "num_blocks": 65536, 00:16:59.270 "uuid": "9c26ff4a-509b-4ef9-9f63-a622885ab7c7", 00:16:59.270 "assigned_rate_limits": { 00:16:59.270 "rw_ios_per_sec": 0, 00:16:59.270 "rw_mbytes_per_sec": 0, 00:16:59.270 "r_mbytes_per_sec": 0, 00:16:59.270 "w_mbytes_per_sec": 0 00:16:59.270 }, 00:16:59.270 "claimed": true, 00:16:59.270 "claim_type": "exclusive_write", 00:16:59.270 "zoned": false, 00:16:59.270 "supported_io_types": { 00:16:59.270 "read": true, 00:16:59.270 "write": true, 00:16:59.270 "unmap": true, 00:16:59.270 "flush": true, 00:16:59.270 "reset": true, 00:16:59.270 "nvme_admin": false, 00:16:59.270 "nvme_io": false, 00:16:59.270 "nvme_io_md": false, 00:16:59.270 "write_zeroes": true, 00:16:59.270 "zcopy": true, 00:16:59.270 "get_zone_info": false, 00:16:59.270 "zone_management": false, 00:16:59.270 "zone_append": false, 00:16:59.270 "compare": false, 00:16:59.270 "compare_and_write": false, 00:16:59.270 "abort": true, 00:16:59.270 "seek_hole": false, 00:16:59.270 "seek_data": false, 00:16:59.270 "copy": true, 00:16:59.270 "nvme_iov_md": false 00:16:59.270 }, 00:16:59.270 "memory_domains": [ 00:16:59.270 { 00:16:59.270 "dma_device_id": "system", 00:16:59.270 "dma_device_type": 1 00:16:59.270 }, 00:16:59.270 { 00:16:59.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.270 "dma_device_type": 2 00:16:59.270 } 00:16:59.270 ], 00:16:59.270 "driver_specific": {} 00:16:59.270 }' 00:16:59.270 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.270 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.270 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:59.270 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.270 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.270 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:59.270 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.270 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.530 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:59.530 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.530 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:59.530 15:53:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:59.530 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:59.530 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:59.530 15:53:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:59.790 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:59.790 "name": "BaseBdev3", 00:16:59.790 "aliases": [ 00:16:59.790 "d9dcfbc4-0e83-493c-b6ab-2151408e6021" 00:16:59.790 ], 00:16:59.790 "product_name": "Malloc disk", 00:16:59.790 "block_size": 512, 00:16:59.790 "num_blocks": 65536, 00:16:59.790 "uuid": "d9dcfbc4-0e83-493c-b6ab-2151408e6021", 00:16:59.790 "assigned_rate_limits": { 00:16:59.790 "rw_ios_per_sec": 0, 00:16:59.790 "rw_mbytes_per_sec": 0, 00:16:59.790 "r_mbytes_per_sec": 0, 00:16:59.790 "w_mbytes_per_sec": 0 00:16:59.790 }, 00:16:59.790 "claimed": true, 00:16:59.790 "claim_type": "exclusive_write", 00:16:59.790 "zoned": false, 00:16:59.790 "supported_io_types": { 00:16:59.790 "read": true, 00:16:59.790 "write": true, 00:16:59.790 "unmap": true, 00:16:59.790 "flush": true, 00:16:59.790 "reset": true, 00:16:59.790 "nvme_admin": false, 00:16:59.790 "nvme_io": false, 00:16:59.790 "nvme_io_md": false, 00:16:59.790 "write_zeroes": true, 00:16:59.790 "zcopy": true, 00:16:59.790 "get_zone_info": false, 00:16:59.790 "zone_management": false, 00:16:59.790 "zone_append": false, 00:16:59.790 "compare": false, 00:16:59.790 "compare_and_write": false, 00:16:59.790 "abort": true, 00:16:59.790 "seek_hole": false, 00:16:59.790 "seek_data": false, 00:16:59.790 "copy": true, 00:16:59.790 "nvme_iov_md": false 00:16:59.791 }, 00:16:59.791 "memory_domains": [ 00:16:59.791 { 00:16:59.791 "dma_device_id": "system", 00:16:59.791 "dma_device_type": 1 00:16:59.791 }, 00:16:59.791 { 00:16:59.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:59.791 "dma_device_type": 2 00:16:59.791 } 00:16:59.791 ], 00:16:59.791 "driver_specific": {} 00:16:59.791 }' 00:16:59.791 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.791 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:59.791 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:59.791 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.791 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:59.791 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:59.791 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:59.791 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:00.050 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:00.050 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.050 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:00.050 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:00.050 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:00.310 [2024-07-12 15:53:20.524072] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:00.310 [2024-07-12 15:53:20.524088] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:00.310 [2024-07-12 15:53:20.524125] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:00.310 [2024-07-12 15:53:20.524330] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:00.310 [2024-07-12 15:53:20.524336] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b20fa0 name Existed_Raid, state offline 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2552185 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2552185 ']' 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2552185 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2552185 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2552185' 00:17:00.310 killing process with pid 2552185 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2552185 00:17:00.310 [2024-07-12 15:53:20.592062] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2552185 00:17:00.310 [2024-07-12 15:53:20.606810] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:00.310 00:17:00.310 real 0m24.184s 00:17:00.310 user 0m45.315s 00:17:00.310 sys 0m3.583s 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:00.310 15:53:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.310 ************************************ 00:17:00.310 END TEST raid_state_function_test 00:17:00.310 ************************************ 00:17:00.570 15:53:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:00.570 15:53:20 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:17:00.570 15:53:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:00.570 15:53:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:00.570 15:53:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:00.570 ************************************ 00:17:00.570 START TEST raid_state_function_test_sb 00:17:00.570 ************************************ 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 true 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2556777 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2556777' 00:17:00.570 Process raid pid: 2556777 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2556777 /var/tmp/spdk-raid.sock 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 2556777 ']' 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:00.570 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:00.570 15:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:00.570 [2024-07-12 15:53:20.875386] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:17:00.570 [2024-07-12 15:53:20.875450] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:00.570 [2024-07-12 15:53:20.965069] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:00.829 [2024-07-12 15:53:21.033441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:00.829 [2024-07-12 15:53:21.073391] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:00.829 [2024-07-12 15:53:21.073413] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:01.770 15:53:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:01.770 15:53:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:17:01.770 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:02.340 [2024-07-12 15:53:22.570194] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:02.340 [2024-07-12 15:53:22.570223] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:02.340 [2024-07-12 15:53:22.570229] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:02.340 [2024-07-12 15:53:22.570235] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:02.340 [2024-07-12 15:53:22.570239] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:02.340 [2024-07-12 15:53:22.570245] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.340 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.599 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.599 "name": "Existed_Raid", 00:17:02.600 "uuid": "d742a104-78b9-4ff1-b950-73441335425f", 00:17:02.600 "strip_size_kb": 0, 00:17:02.600 "state": "configuring", 00:17:02.600 "raid_level": "raid1", 00:17:02.600 "superblock": true, 00:17:02.600 "num_base_bdevs": 3, 00:17:02.600 "num_base_bdevs_discovered": 0, 00:17:02.600 "num_base_bdevs_operational": 3, 00:17:02.600 "base_bdevs_list": [ 00:17:02.600 { 00:17:02.600 "name": "BaseBdev1", 00:17:02.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.600 "is_configured": false, 00:17:02.600 "data_offset": 0, 00:17:02.600 "data_size": 0 00:17:02.600 }, 00:17:02.600 { 00:17:02.600 "name": "BaseBdev2", 00:17:02.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.600 "is_configured": false, 00:17:02.600 "data_offset": 0, 00:17:02.600 "data_size": 0 00:17:02.600 }, 00:17:02.600 { 00:17:02.600 "name": "BaseBdev3", 00:17:02.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.600 "is_configured": false, 00:17:02.600 "data_offset": 0, 00:17:02.600 "data_size": 0 00:17:02.600 } 00:17:02.600 ] 00:17:02.600 }' 00:17:02.600 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.600 15:53:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:03.170 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:03.170 [2024-07-12 15:53:23.484384] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:03.170 [2024-07-12 15:53:23.484402] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x201b900 name Existed_Raid, state configuring 00:17:03.170 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:03.429 [2024-07-12 15:53:23.672884] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:03.429 [2024-07-12 15:53:23.672899] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:03.429 [2024-07-12 15:53:23.672905] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:03.429 [2024-07-12 15:53:23.672910] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:17:03.429 [2024-07-12 15:53:23.672915] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:03.429 [2024-07-12 15:53:23.672920] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:03.430 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:03.430 [2024-07-12 15:53:23.872078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:03.430 BaseBdev1 00:17:03.690 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:03.690 15:53:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:03.690 15:53:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:03.690 15:53:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:03.690 15:53:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:03.690 15:53:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:03.690 15:53:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:03.690 15:53:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:03.950 [ 00:17:03.950 { 00:17:03.950 "name": "BaseBdev1", 00:17:03.950 "aliases": [ 00:17:03.950 "487ee9d3-0628-48e4-afab-cd0a3a7aef90" 00:17:03.950 ], 00:17:03.950 "product_name": "Malloc disk", 00:17:03.950 "block_size": 512, 00:17:03.950 "num_blocks": 65536, 00:17:03.950 "uuid": "487ee9d3-0628-48e4-afab-cd0a3a7aef90", 00:17:03.950 "assigned_rate_limits": { 00:17:03.950 "rw_ios_per_sec": 0, 00:17:03.950 "rw_mbytes_per_sec": 0, 00:17:03.950 "r_mbytes_per_sec": 0, 00:17:03.950 "w_mbytes_per_sec": 0 00:17:03.950 }, 00:17:03.950 "claimed": true, 00:17:03.950 "claim_type": "exclusive_write", 00:17:03.950 "zoned": false, 00:17:03.950 "supported_io_types": { 00:17:03.950 "read": true, 00:17:03.950 "write": true, 00:17:03.950 "unmap": true, 00:17:03.950 "flush": true, 00:17:03.950 "reset": true, 00:17:03.950 "nvme_admin": false, 00:17:03.950 "nvme_io": false, 00:17:03.950 "nvme_io_md": false, 00:17:03.950 "write_zeroes": true, 00:17:03.950 "zcopy": true, 00:17:03.950 "get_zone_info": false, 00:17:03.950 "zone_management": false, 00:17:03.950 "zone_append": false, 00:17:03.950 "compare": false, 00:17:03.950 "compare_and_write": false, 00:17:03.950 "abort": true, 00:17:03.950 "seek_hole": false, 00:17:03.950 "seek_data": false, 00:17:03.950 "copy": true, 00:17:03.950 "nvme_iov_md": false 00:17:03.950 }, 00:17:03.950 "memory_domains": [ 00:17:03.950 { 00:17:03.950 "dma_device_id": "system", 00:17:03.950 "dma_device_type": 1 00:17:03.950 }, 00:17:03.950 { 00:17:03.950 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.950 "dma_device_type": 2 00:17:03.950 } 00:17:03.950 ], 00:17:03.950 "driver_specific": {} 00:17:03.950 } 00:17:03.950 ] 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 
00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.950 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.211 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.211 "name": "Existed_Raid", 00:17:04.211 "uuid": "a736233c-7757-4555-b7ad-5fa313ccb904", 00:17:04.211 "strip_size_kb": 0, 00:17:04.211 "state": "configuring", 00:17:04.211 "raid_level": "raid1", 00:17:04.211 "superblock": true, 00:17:04.211 "num_base_bdevs": 3, 00:17:04.211 "num_base_bdevs_discovered": 1, 00:17:04.211 "num_base_bdevs_operational": 3, 00:17:04.211 "base_bdevs_list": [ 00:17:04.211 { 00:17:04.211 "name": "BaseBdev1", 00:17:04.211 "uuid": "487ee9d3-0628-48e4-afab-cd0a3a7aef90", 00:17:04.211 "is_configured": true, 00:17:04.211 "data_offset": 2048, 00:17:04.211 "data_size": 63488 00:17:04.211 }, 00:17:04.211 { 00:17:04.211 "name": "BaseBdev2", 00:17:04.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.211 "is_configured": false, 00:17:04.211 "data_offset": 0, 00:17:04.211 "data_size": 0 00:17:04.211 }, 00:17:04.211 { 00:17:04.211 "name": "BaseBdev3", 00:17:04.211 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.211 "is_configured": false, 00:17:04.211 "data_offset": 0, 00:17:04.211 "data_size": 0 00:17:04.211 } 00:17:04.211 ] 00:17:04.211 }' 00:17:04.211 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.211 15:53:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:04.780 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:04.780 [2024-07-12 15:53:25.183368] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:04.780 [2024-07-12 15:53:25.183395] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x201b190 name Existed_Raid, state configuring 00:17:04.780 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:05.350 [2024-07-12 15:53:25.708699] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:05.350 [2024-07-12 15:53:25.709799] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:05.350 [2024-07-12 15:53:25.709821] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:05.350 [2024-07-12 15:53:25.709826] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:05.350 [2024-07-12 15:53:25.709832] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.350 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:05.610 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.610 "name": "Existed_Raid", 00:17:05.610 "uuid": "6cbf4c92-edc4-44ed-a340-349befd196e5", 00:17:05.610 "strip_size_kb": 0, 00:17:05.610 "state": "configuring", 00:17:05.610 "raid_level": "raid1", 00:17:05.610 "superblock": true, 00:17:05.610 "num_base_bdevs": 3, 00:17:05.610 "num_base_bdevs_discovered": 1, 00:17:05.610 "num_base_bdevs_operational": 3, 00:17:05.610 "base_bdevs_list": [ 00:17:05.610 { 00:17:05.610 "name": "BaseBdev1", 00:17:05.610 "uuid": "487ee9d3-0628-48e4-afab-cd0a3a7aef90", 00:17:05.610 "is_configured": true, 00:17:05.610 "data_offset": 2048, 00:17:05.610 "data_size": 63488 00:17:05.610 }, 00:17:05.610 { 00:17:05.610 "name": "BaseBdev2", 00:17:05.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.610 "is_configured": false, 00:17:05.610 "data_offset": 0, 00:17:05.610 "data_size": 0 00:17:05.610 }, 00:17:05.610 { 00:17:05.610 "name": 
"BaseBdev3", 00:17:05.610 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:05.610 "is_configured": false, 00:17:05.610 "data_offset": 0, 00:17:05.610 "data_size": 0 00:17:05.610 } 00:17:05.610 ] 00:17:05.610 }' 00:17:05.610 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.610 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:06.179 15:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:06.439 [2024-07-12 15:53:26.655986] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:06.439 BaseBdev2 00:17:06.439 15:53:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:06.439 15:53:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:06.439 15:53:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:06.439 15:53:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:06.439 15:53:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:06.439 15:53:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:06.439 15:53:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:06.439 15:53:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:06.699 [ 00:17:06.699 { 00:17:06.699 "name": "BaseBdev2", 00:17:06.699 "aliases": [ 00:17:06.699 "5f33135e-bbd5-4cd9-9811-4c4ef25e6469" 00:17:06.699 ], 00:17:06.699 "product_name": "Malloc disk", 00:17:06.699 "block_size": 512, 00:17:06.699 "num_blocks": 65536, 00:17:06.699 "uuid": "5f33135e-bbd5-4cd9-9811-4c4ef25e6469", 00:17:06.699 "assigned_rate_limits": { 00:17:06.699 "rw_ios_per_sec": 0, 00:17:06.699 "rw_mbytes_per_sec": 0, 00:17:06.699 "r_mbytes_per_sec": 0, 00:17:06.699 "w_mbytes_per_sec": 0 00:17:06.699 }, 00:17:06.699 "claimed": true, 00:17:06.699 "claim_type": "exclusive_write", 00:17:06.699 "zoned": false, 00:17:06.699 "supported_io_types": { 00:17:06.699 "read": true, 00:17:06.699 "write": true, 00:17:06.699 "unmap": true, 00:17:06.699 "flush": true, 00:17:06.699 "reset": true, 00:17:06.699 "nvme_admin": false, 00:17:06.699 "nvme_io": false, 00:17:06.699 "nvme_io_md": false, 00:17:06.699 "write_zeroes": true, 00:17:06.699 "zcopy": true, 00:17:06.699 "get_zone_info": false, 00:17:06.699 "zone_management": false, 00:17:06.699 "zone_append": false, 00:17:06.699 "compare": false, 00:17:06.699 "compare_and_write": false, 00:17:06.699 "abort": true, 00:17:06.699 "seek_hole": false, 00:17:06.699 "seek_data": false, 00:17:06.699 "copy": true, 00:17:06.699 "nvme_iov_md": false 00:17:06.699 }, 00:17:06.699 "memory_domains": [ 00:17:06.699 { 00:17:06.699 "dma_device_id": "system", 00:17:06.699 "dma_device_type": 1 00:17:06.699 }, 00:17:06.699 { 00:17:06.699 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.699 "dma_device_type": 2 00:17:06.699 } 00:17:06.699 ], 00:17:06.699 "driver_specific": {} 
00:17:06.699 } 00:17:06.699 ] 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.699 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:06.960 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.960 "name": "Existed_Raid", 00:17:06.960 "uuid": "6cbf4c92-edc4-44ed-a340-349befd196e5", 00:17:06.960 "strip_size_kb": 0, 00:17:06.960 "state": "configuring", 00:17:06.960 "raid_level": "raid1", 00:17:06.960 "superblock": true, 00:17:06.960 "num_base_bdevs": 3, 00:17:06.960 "num_base_bdevs_discovered": 2, 00:17:06.960 "num_base_bdevs_operational": 3, 00:17:06.960 "base_bdevs_list": [ 00:17:06.960 { 00:17:06.960 "name": "BaseBdev1", 00:17:06.960 "uuid": "487ee9d3-0628-48e4-afab-cd0a3a7aef90", 00:17:06.960 "is_configured": true, 00:17:06.960 "data_offset": 2048, 00:17:06.960 "data_size": 63488 00:17:06.960 }, 00:17:06.960 { 00:17:06.960 "name": "BaseBdev2", 00:17:06.960 "uuid": "5f33135e-bbd5-4cd9-9811-4c4ef25e6469", 00:17:06.960 "is_configured": true, 00:17:06.960 "data_offset": 2048, 00:17:06.960 "data_size": 63488 00:17:06.960 }, 00:17:06.960 { 00:17:06.960 "name": "BaseBdev3", 00:17:06.960 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.960 "is_configured": false, 00:17:06.960 "data_offset": 0, 00:17:06.960 "data_size": 0 00:17:06.960 } 00:17:06.960 ] 00:17:06.960 }' 00:17:06.960 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.960 15:53:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:07.528 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:07.528 [2024-07-12 
15:53:27.964146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:07.528 [2024-07-12 15:53:27.964266] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x201c280 00:17:07.528 [2024-07-12 15:53:27.964274] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:07.528 [2024-07-12 15:53:27.964409] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x201bd70 00:17:07.528 [2024-07-12 15:53:27.964503] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x201c280 00:17:07.528 [2024-07-12 15:53:27.964508] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x201c280 00:17:07.528 [2024-07-12 15:53:27.964579] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:07.528 BaseBdev3 00:17:07.787 15:53:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:07.788 15:53:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:07.788 15:53:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:07.788 15:53:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:07.788 15:53:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:07.788 15:53:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:07.788 15:53:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:07.788 15:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:08.048 [ 00:17:08.048 { 00:17:08.048 "name": "BaseBdev3", 00:17:08.048 "aliases": [ 00:17:08.048 "5e988186-1396-4901-ab40-7a7e3fd884a1" 00:17:08.048 ], 00:17:08.048 "product_name": "Malloc disk", 00:17:08.048 "block_size": 512, 00:17:08.048 "num_blocks": 65536, 00:17:08.048 "uuid": "5e988186-1396-4901-ab40-7a7e3fd884a1", 00:17:08.048 "assigned_rate_limits": { 00:17:08.048 "rw_ios_per_sec": 0, 00:17:08.048 "rw_mbytes_per_sec": 0, 00:17:08.048 "r_mbytes_per_sec": 0, 00:17:08.048 "w_mbytes_per_sec": 0 00:17:08.048 }, 00:17:08.048 "claimed": true, 00:17:08.048 "claim_type": "exclusive_write", 00:17:08.048 "zoned": false, 00:17:08.048 "supported_io_types": { 00:17:08.048 "read": true, 00:17:08.048 "write": true, 00:17:08.048 "unmap": true, 00:17:08.048 "flush": true, 00:17:08.048 "reset": true, 00:17:08.048 "nvme_admin": false, 00:17:08.048 "nvme_io": false, 00:17:08.048 "nvme_io_md": false, 00:17:08.048 "write_zeroes": true, 00:17:08.048 "zcopy": true, 00:17:08.048 "get_zone_info": false, 00:17:08.048 "zone_management": false, 00:17:08.048 "zone_append": false, 00:17:08.048 "compare": false, 00:17:08.048 "compare_and_write": false, 00:17:08.048 "abort": true, 00:17:08.048 "seek_hole": false, 00:17:08.048 "seek_data": false, 00:17:08.048 "copy": true, 00:17:08.048 "nvme_iov_md": false 00:17:08.048 }, 00:17:08.048 "memory_domains": [ 00:17:08.048 { 00:17:08.048 "dma_device_id": "system", 00:17:08.048 "dma_device_type": 1 00:17:08.048 }, 00:17:08.048 { 00:17:08.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:08.048 
"dma_device_type": 2 00:17:08.048 } 00:17:08.048 ], 00:17:08.048 "driver_specific": {} 00:17:08.048 } 00:17:08.048 ] 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.048 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.618 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.618 "name": "Existed_Raid", 00:17:08.618 "uuid": "6cbf4c92-edc4-44ed-a340-349befd196e5", 00:17:08.618 "strip_size_kb": 0, 00:17:08.618 "state": "online", 00:17:08.618 "raid_level": "raid1", 00:17:08.618 "superblock": true, 00:17:08.618 "num_base_bdevs": 3, 00:17:08.618 "num_base_bdevs_discovered": 3, 00:17:08.618 "num_base_bdevs_operational": 3, 00:17:08.618 "base_bdevs_list": [ 00:17:08.618 { 00:17:08.618 "name": "BaseBdev1", 00:17:08.618 "uuid": "487ee9d3-0628-48e4-afab-cd0a3a7aef90", 00:17:08.618 "is_configured": true, 00:17:08.618 "data_offset": 2048, 00:17:08.618 "data_size": 63488 00:17:08.618 }, 00:17:08.618 { 00:17:08.618 "name": "BaseBdev2", 00:17:08.618 "uuid": "5f33135e-bbd5-4cd9-9811-4c4ef25e6469", 00:17:08.618 "is_configured": true, 00:17:08.618 "data_offset": 2048, 00:17:08.618 "data_size": 63488 00:17:08.618 }, 00:17:08.618 { 00:17:08.618 "name": "BaseBdev3", 00:17:08.618 "uuid": "5e988186-1396-4901-ab40-7a7e3fd884a1", 00:17:08.618 "is_configured": true, 00:17:08.618 "data_offset": 2048, 00:17:08.618 "data_size": 63488 00:17:08.618 } 00:17:08.618 ] 00:17:08.618 }' 00:17:08.618 15:53:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.618 15:53:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:09.188 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:09.188 15:53:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:09.188 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:09.188 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:09.188 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:09.188 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:09.188 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:09.188 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:09.448 [2024-07-12 15:53:29.644637] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:09.448 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:09.448 "name": "Existed_Raid", 00:17:09.448 "aliases": [ 00:17:09.448 "6cbf4c92-edc4-44ed-a340-349befd196e5" 00:17:09.448 ], 00:17:09.448 "product_name": "Raid Volume", 00:17:09.448 "block_size": 512, 00:17:09.448 "num_blocks": 63488, 00:17:09.448 "uuid": "6cbf4c92-edc4-44ed-a340-349befd196e5", 00:17:09.448 "assigned_rate_limits": { 00:17:09.448 "rw_ios_per_sec": 0, 00:17:09.448 "rw_mbytes_per_sec": 0, 00:17:09.448 "r_mbytes_per_sec": 0, 00:17:09.448 "w_mbytes_per_sec": 0 00:17:09.448 }, 00:17:09.448 "claimed": false, 00:17:09.448 "zoned": false, 00:17:09.448 "supported_io_types": { 00:17:09.448 "read": true, 00:17:09.448 "write": true, 00:17:09.448 "unmap": false, 00:17:09.448 "flush": false, 00:17:09.448 "reset": true, 00:17:09.448 "nvme_admin": false, 00:17:09.448 "nvme_io": false, 00:17:09.448 "nvme_io_md": false, 00:17:09.448 "write_zeroes": true, 00:17:09.448 "zcopy": false, 00:17:09.448 "get_zone_info": false, 00:17:09.448 "zone_management": false, 00:17:09.448 "zone_append": false, 00:17:09.448 "compare": false, 00:17:09.448 "compare_and_write": false, 00:17:09.448 "abort": false, 00:17:09.448 "seek_hole": false, 00:17:09.448 "seek_data": false, 00:17:09.448 "copy": false, 00:17:09.448 "nvme_iov_md": false 00:17:09.448 }, 00:17:09.448 "memory_domains": [ 00:17:09.448 { 00:17:09.448 "dma_device_id": "system", 00:17:09.448 "dma_device_type": 1 00:17:09.448 }, 00:17:09.448 { 00:17:09.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.448 "dma_device_type": 2 00:17:09.448 }, 00:17:09.448 { 00:17:09.448 "dma_device_id": "system", 00:17:09.448 "dma_device_type": 1 00:17:09.448 }, 00:17:09.448 { 00:17:09.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.448 "dma_device_type": 2 00:17:09.448 }, 00:17:09.448 { 00:17:09.448 "dma_device_id": "system", 00:17:09.448 "dma_device_type": 1 00:17:09.448 }, 00:17:09.448 { 00:17:09.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.448 "dma_device_type": 2 00:17:09.448 } 00:17:09.448 ], 00:17:09.448 "driver_specific": { 00:17:09.448 "raid": { 00:17:09.448 "uuid": "6cbf4c92-edc4-44ed-a340-349befd196e5", 00:17:09.448 "strip_size_kb": 0, 00:17:09.448 "state": "online", 00:17:09.448 "raid_level": "raid1", 00:17:09.448 "superblock": true, 00:17:09.448 "num_base_bdevs": 3, 00:17:09.448 "num_base_bdevs_discovered": 3, 00:17:09.448 "num_base_bdevs_operational": 3, 00:17:09.448 "base_bdevs_list": [ 00:17:09.448 { 00:17:09.448 "name": "BaseBdev1", 00:17:09.448 "uuid": 
"487ee9d3-0628-48e4-afab-cd0a3a7aef90", 00:17:09.448 "is_configured": true, 00:17:09.448 "data_offset": 2048, 00:17:09.448 "data_size": 63488 00:17:09.448 }, 00:17:09.448 { 00:17:09.448 "name": "BaseBdev2", 00:17:09.448 "uuid": "5f33135e-bbd5-4cd9-9811-4c4ef25e6469", 00:17:09.448 "is_configured": true, 00:17:09.448 "data_offset": 2048, 00:17:09.448 "data_size": 63488 00:17:09.448 }, 00:17:09.448 { 00:17:09.448 "name": "BaseBdev3", 00:17:09.448 "uuid": "5e988186-1396-4901-ab40-7a7e3fd884a1", 00:17:09.448 "is_configured": true, 00:17:09.448 "data_offset": 2048, 00:17:09.448 "data_size": 63488 00:17:09.448 } 00:17:09.448 ] 00:17:09.448 } 00:17:09.448 } 00:17:09.448 }' 00:17:09.448 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:09.448 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:09.448 BaseBdev2 00:17:09.448 BaseBdev3' 00:17:09.448 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:09.448 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:09.448 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:09.448 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:09.448 "name": "BaseBdev1", 00:17:09.448 "aliases": [ 00:17:09.448 "487ee9d3-0628-48e4-afab-cd0a3a7aef90" 00:17:09.448 ], 00:17:09.448 "product_name": "Malloc disk", 00:17:09.448 "block_size": 512, 00:17:09.448 "num_blocks": 65536, 00:17:09.448 "uuid": "487ee9d3-0628-48e4-afab-cd0a3a7aef90", 00:17:09.448 "assigned_rate_limits": { 00:17:09.448 "rw_ios_per_sec": 0, 00:17:09.448 "rw_mbytes_per_sec": 0, 00:17:09.448 "r_mbytes_per_sec": 0, 00:17:09.448 "w_mbytes_per_sec": 0 00:17:09.448 }, 00:17:09.448 "claimed": true, 00:17:09.448 "claim_type": "exclusive_write", 00:17:09.448 "zoned": false, 00:17:09.448 "supported_io_types": { 00:17:09.448 "read": true, 00:17:09.448 "write": true, 00:17:09.448 "unmap": true, 00:17:09.448 "flush": true, 00:17:09.448 "reset": true, 00:17:09.448 "nvme_admin": false, 00:17:09.448 "nvme_io": false, 00:17:09.448 "nvme_io_md": false, 00:17:09.448 "write_zeroes": true, 00:17:09.448 "zcopy": true, 00:17:09.448 "get_zone_info": false, 00:17:09.448 "zone_management": false, 00:17:09.448 "zone_append": false, 00:17:09.448 "compare": false, 00:17:09.448 "compare_and_write": false, 00:17:09.448 "abort": true, 00:17:09.448 "seek_hole": false, 00:17:09.448 "seek_data": false, 00:17:09.448 "copy": true, 00:17:09.448 "nvme_iov_md": false 00:17:09.448 }, 00:17:09.448 "memory_domains": [ 00:17:09.448 { 00:17:09.448 "dma_device_id": "system", 00:17:09.448 "dma_device_type": 1 00:17:09.448 }, 00:17:09.448 { 00:17:09.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:09.448 "dma_device_type": 2 00:17:09.448 } 00:17:09.448 ], 00:17:09.448 "driver_specific": {} 00:17:09.448 }' 00:17:09.448 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.708 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:09.708 15:53:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:09.708 15:53:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.708 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:09.708 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:09.708 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.708 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:09.967 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:09.967 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.967 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:09.967 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:09.967 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:09.967 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:09.967 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:10.226 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:10.226 "name": "BaseBdev2", 00:17:10.226 "aliases": [ 00:17:10.226 "5f33135e-bbd5-4cd9-9811-4c4ef25e6469" 00:17:10.226 ], 00:17:10.226 "product_name": "Malloc disk", 00:17:10.226 "block_size": 512, 00:17:10.226 "num_blocks": 65536, 00:17:10.226 "uuid": "5f33135e-bbd5-4cd9-9811-4c4ef25e6469", 00:17:10.226 "assigned_rate_limits": { 00:17:10.226 "rw_ios_per_sec": 0, 00:17:10.226 "rw_mbytes_per_sec": 0, 00:17:10.226 "r_mbytes_per_sec": 0, 00:17:10.226 "w_mbytes_per_sec": 0 00:17:10.226 }, 00:17:10.226 "claimed": true, 00:17:10.226 "claim_type": "exclusive_write", 00:17:10.226 "zoned": false, 00:17:10.226 "supported_io_types": { 00:17:10.226 "read": true, 00:17:10.226 "write": true, 00:17:10.226 "unmap": true, 00:17:10.226 "flush": true, 00:17:10.226 "reset": true, 00:17:10.226 "nvme_admin": false, 00:17:10.226 "nvme_io": false, 00:17:10.226 "nvme_io_md": false, 00:17:10.226 "write_zeroes": true, 00:17:10.226 "zcopy": true, 00:17:10.226 "get_zone_info": false, 00:17:10.226 "zone_management": false, 00:17:10.226 "zone_append": false, 00:17:10.226 "compare": false, 00:17:10.226 "compare_and_write": false, 00:17:10.226 "abort": true, 00:17:10.226 "seek_hole": false, 00:17:10.226 "seek_data": false, 00:17:10.226 "copy": true, 00:17:10.226 "nvme_iov_md": false 00:17:10.226 }, 00:17:10.226 "memory_domains": [ 00:17:10.226 { 00:17:10.226 "dma_device_id": "system", 00:17:10.226 "dma_device_type": 1 00:17:10.226 }, 00:17:10.226 { 00:17:10.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.226 "dma_device_type": 2 00:17:10.226 } 00:17:10.226 ], 00:17:10.226 "driver_specific": {} 00:17:10.226 }' 00:17:10.226 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:10.226 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:10.226 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:10.226 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:10.226 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:17:10.226 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:10.226 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:10.226 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:10.485 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:10.485 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:10.485 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:10.485 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:10.485 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:10.485 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:10.485 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:10.746 15:53:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:10.746 "name": "BaseBdev3", 00:17:10.746 "aliases": [ 00:17:10.746 "5e988186-1396-4901-ab40-7a7e3fd884a1" 00:17:10.746 ], 00:17:10.746 "product_name": "Malloc disk", 00:17:10.746 "block_size": 512, 00:17:10.746 "num_blocks": 65536, 00:17:10.746 "uuid": "5e988186-1396-4901-ab40-7a7e3fd884a1", 00:17:10.746 "assigned_rate_limits": { 00:17:10.746 "rw_ios_per_sec": 0, 00:17:10.746 "rw_mbytes_per_sec": 0, 00:17:10.746 "r_mbytes_per_sec": 0, 00:17:10.746 "w_mbytes_per_sec": 0 00:17:10.746 }, 00:17:10.746 "claimed": true, 00:17:10.746 "claim_type": "exclusive_write", 00:17:10.746 "zoned": false, 00:17:10.746 "supported_io_types": { 00:17:10.746 "read": true, 00:17:10.746 "write": true, 00:17:10.746 "unmap": true, 00:17:10.746 "flush": true, 00:17:10.746 "reset": true, 00:17:10.746 "nvme_admin": false, 00:17:10.746 "nvme_io": false, 00:17:10.746 "nvme_io_md": false, 00:17:10.746 "write_zeroes": true, 00:17:10.746 "zcopy": true, 00:17:10.746 "get_zone_info": false, 00:17:10.746 "zone_management": false, 00:17:10.746 "zone_append": false, 00:17:10.746 "compare": false, 00:17:10.746 "compare_and_write": false, 00:17:10.746 "abort": true, 00:17:10.746 "seek_hole": false, 00:17:10.746 "seek_data": false, 00:17:10.746 "copy": true, 00:17:10.746 "nvme_iov_md": false 00:17:10.746 }, 00:17:10.746 "memory_domains": [ 00:17:10.746 { 00:17:10.746 "dma_device_id": "system", 00:17:10.746 "dma_device_type": 1 00:17:10.746 }, 00:17:10.746 { 00:17:10.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.747 "dma_device_type": 2 00:17:10.747 } 00:17:10.747 ], 00:17:10.747 "driver_specific": {} 00:17:10.747 }' 00:17:10.747 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:10.747 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:10.747 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:10.747 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:10.747 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:10.747 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:10.747 
15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:11.006 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:11.006 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:11.006 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:11.006 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:11.006 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:11.006 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:11.266 [2024-07-12 15:53:31.497121] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.266 "name": "Existed_Raid", 00:17:11.266 "uuid": "6cbf4c92-edc4-44ed-a340-349befd196e5", 00:17:11.266 "strip_size_kb": 0, 00:17:11.266 "state": "online", 00:17:11.266 "raid_level": "raid1", 00:17:11.266 "superblock": true, 00:17:11.266 "num_base_bdevs": 3, 00:17:11.266 "num_base_bdevs_discovered": 2, 00:17:11.266 "num_base_bdevs_operational": 2, 00:17:11.266 "base_bdevs_list": [ 00:17:11.266 { 00:17:11.266 "name": null, 00:17:11.266 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:17:11.266 "is_configured": false, 00:17:11.266 "data_offset": 2048, 00:17:11.266 "data_size": 63488 00:17:11.266 }, 00:17:11.266 { 00:17:11.266 "name": "BaseBdev2", 00:17:11.266 "uuid": "5f33135e-bbd5-4cd9-9811-4c4ef25e6469", 00:17:11.266 "is_configured": true, 00:17:11.266 "data_offset": 2048, 00:17:11.266 "data_size": 63488 00:17:11.266 }, 00:17:11.266 { 00:17:11.266 "name": "BaseBdev3", 00:17:11.266 "uuid": "5e988186-1396-4901-ab40-7a7e3fd884a1", 00:17:11.266 "is_configured": true, 00:17:11.266 "data_offset": 2048, 00:17:11.266 "data_size": 63488 00:17:11.266 } 00:17:11.266 ] 00:17:11.266 }' 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.266 15:53:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:11.836 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:11.836 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:11.836 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.836 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:12.096 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:12.096 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:12.096 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:12.355 [2024-07-12 15:53:32.635989] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:12.355 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:12.355 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:12.355 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.355 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:12.615 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:12.615 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:12.615 15:53:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:12.615 [2024-07-12 15:53:33.026850] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:12.615 [2024-07-12 15:53:33.026914] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:12.615 [2024-07-12 15:53:33.032918] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:12.615 [2024-07-12 15:53:33.032942] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:12.615 [2024-07-12 15:53:33.032948] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x201c280 
name Existed_Raid, state offline 00:17:12.615 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:12.615 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:12.615 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.615 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:12.874 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:12.874 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:12.874 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:12.874 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:12.874 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:12.874 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:13.133 BaseBdev2 00:17:13.133 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:13.133 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:13.133 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:13.133 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:13.133 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:13.133 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:13.133 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:13.393 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:13.393 [ 00:17:13.393 { 00:17:13.393 "name": "BaseBdev2", 00:17:13.393 "aliases": [ 00:17:13.393 "56275e7f-a481-402f-b632-c01cd45a51a0" 00:17:13.393 ], 00:17:13.393 "product_name": "Malloc disk", 00:17:13.393 "block_size": 512, 00:17:13.393 "num_blocks": 65536, 00:17:13.393 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:13.393 "assigned_rate_limits": { 00:17:13.393 "rw_ios_per_sec": 0, 00:17:13.393 "rw_mbytes_per_sec": 0, 00:17:13.393 "r_mbytes_per_sec": 0, 00:17:13.393 "w_mbytes_per_sec": 0 00:17:13.393 }, 00:17:13.393 "claimed": false, 00:17:13.393 "zoned": false, 00:17:13.393 "supported_io_types": { 00:17:13.393 "read": true, 00:17:13.393 "write": true, 00:17:13.393 "unmap": true, 00:17:13.393 "flush": true, 00:17:13.393 "reset": true, 00:17:13.393 "nvme_admin": false, 00:17:13.393 "nvme_io": false, 00:17:13.393 "nvme_io_md": false, 00:17:13.393 "write_zeroes": true, 00:17:13.393 "zcopy": true, 00:17:13.393 "get_zone_info": false, 00:17:13.393 "zone_management": false, 00:17:13.393 "zone_append": false, 00:17:13.393 "compare": false, 00:17:13.393 
"compare_and_write": false, 00:17:13.393 "abort": true, 00:17:13.393 "seek_hole": false, 00:17:13.393 "seek_data": false, 00:17:13.393 "copy": true, 00:17:13.393 "nvme_iov_md": false 00:17:13.393 }, 00:17:13.393 "memory_domains": [ 00:17:13.393 { 00:17:13.393 "dma_device_id": "system", 00:17:13.393 "dma_device_type": 1 00:17:13.393 }, 00:17:13.393 { 00:17:13.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.393 "dma_device_type": 2 00:17:13.393 } 00:17:13.393 ], 00:17:13.393 "driver_specific": {} 00:17:13.393 } 00:17:13.393 ] 00:17:13.393 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:13.393 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:13.393 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:13.393 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:13.653 BaseBdev3 00:17:13.653 15:53:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:13.653 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:13.653 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:13.653 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:13.653 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:13.653 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:13.653 15:53:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:13.912 15:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:14.172 [ 00:17:14.172 { 00:17:14.172 "name": "BaseBdev3", 00:17:14.172 "aliases": [ 00:17:14.172 "6192dec9-900c-4912-9d44-419cd964bc3b" 00:17:14.172 ], 00:17:14.172 "product_name": "Malloc disk", 00:17:14.172 "block_size": 512, 00:17:14.172 "num_blocks": 65536, 00:17:14.172 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:14.172 "assigned_rate_limits": { 00:17:14.172 "rw_ios_per_sec": 0, 00:17:14.172 "rw_mbytes_per_sec": 0, 00:17:14.172 "r_mbytes_per_sec": 0, 00:17:14.172 "w_mbytes_per_sec": 0 00:17:14.172 }, 00:17:14.172 "claimed": false, 00:17:14.172 "zoned": false, 00:17:14.172 "supported_io_types": { 00:17:14.172 "read": true, 00:17:14.172 "write": true, 00:17:14.172 "unmap": true, 00:17:14.172 "flush": true, 00:17:14.172 "reset": true, 00:17:14.172 "nvme_admin": false, 00:17:14.172 "nvme_io": false, 00:17:14.172 "nvme_io_md": false, 00:17:14.172 "write_zeroes": true, 00:17:14.172 "zcopy": true, 00:17:14.172 "get_zone_info": false, 00:17:14.172 "zone_management": false, 00:17:14.172 "zone_append": false, 00:17:14.172 "compare": false, 00:17:14.172 "compare_and_write": false, 00:17:14.172 "abort": true, 00:17:14.172 "seek_hole": false, 00:17:14.172 "seek_data": false, 00:17:14.172 "copy": true, 00:17:14.172 "nvme_iov_md": false 00:17:14.172 }, 00:17:14.172 "memory_domains": [ 00:17:14.172 { 
00:17:14.172 "dma_device_id": "system", 00:17:14.172 "dma_device_type": 1 00:17:14.172 }, 00:17:14.172 { 00:17:14.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.172 "dma_device_type": 2 00:17:14.172 } 00:17:14.172 ], 00:17:14.172 "driver_specific": {} 00:17:14.172 } 00:17:14.172 ] 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:14.172 [2024-07-12 15:53:34.546672] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:14.172 [2024-07-12 15:53:34.546699] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:14.172 [2024-07-12 15:53:34.546716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:14.172 [2024-07-12 15:53:34.547732] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.172 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:14.433 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:14.433 "name": "Existed_Raid", 00:17:14.433 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:14.433 "strip_size_kb": 0, 00:17:14.433 "state": "configuring", 00:17:14.433 "raid_level": "raid1", 00:17:14.433 "superblock": true, 00:17:14.433 "num_base_bdevs": 3, 00:17:14.433 "num_base_bdevs_discovered": 2, 00:17:14.433 "num_base_bdevs_operational": 3, 00:17:14.433 "base_bdevs_list": [ 00:17:14.433 { 00:17:14.433 "name": "BaseBdev1", 00:17:14.433 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:14.433 "is_configured": 
false, 00:17:14.433 "data_offset": 0, 00:17:14.433 "data_size": 0 00:17:14.433 }, 00:17:14.433 { 00:17:14.433 "name": "BaseBdev2", 00:17:14.433 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:14.433 "is_configured": true, 00:17:14.433 "data_offset": 2048, 00:17:14.433 "data_size": 63488 00:17:14.433 }, 00:17:14.433 { 00:17:14.433 "name": "BaseBdev3", 00:17:14.433 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:14.433 "is_configured": true, 00:17:14.433 "data_offset": 2048, 00:17:14.433 "data_size": 63488 00:17:14.433 } 00:17:14.433 ] 00:17:14.433 }' 00:17:14.433 15:53:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:14.433 15:53:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:15.013 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:15.330 [2024-07-12 15:53:35.456951] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.330 "name": "Existed_Raid", 00:17:15.330 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:15.330 "strip_size_kb": 0, 00:17:15.330 "state": "configuring", 00:17:15.330 "raid_level": "raid1", 00:17:15.330 "superblock": true, 00:17:15.330 "num_base_bdevs": 3, 00:17:15.330 "num_base_bdevs_discovered": 1, 00:17:15.330 "num_base_bdevs_operational": 3, 00:17:15.330 "base_bdevs_list": [ 00:17:15.330 { 00:17:15.330 "name": "BaseBdev1", 00:17:15.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:15.330 "is_configured": false, 00:17:15.330 "data_offset": 0, 00:17:15.330 "data_size": 0 00:17:15.330 }, 00:17:15.330 { 00:17:15.330 "name": null, 00:17:15.330 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:15.330 "is_configured": false, 00:17:15.330 "data_offset": 2048, 00:17:15.330 "data_size": 
63488 00:17:15.330 }, 00:17:15.330 { 00:17:15.330 "name": "BaseBdev3", 00:17:15.330 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:15.330 "is_configured": true, 00:17:15.330 "data_offset": 2048, 00:17:15.330 "data_size": 63488 00:17:15.330 } 00:17:15.330 ] 00:17:15.330 }' 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.330 15:53:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:15.900 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.900 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:16.159 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:16.159 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:16.159 [2024-07-12 15:53:36.560628] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:16.159 BaseBdev1 00:17:16.159 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:16.159 15:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:16.159 15:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:16.159 15:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:16.159 15:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:16.159 15:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:16.159 15:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:16.419 15:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:16.680 [ 00:17:16.680 { 00:17:16.680 "name": "BaseBdev1", 00:17:16.680 "aliases": [ 00:17:16.680 "7da162a6-5933-4b43-b8bd-149bdfdcb774" 00:17:16.680 ], 00:17:16.680 "product_name": "Malloc disk", 00:17:16.680 "block_size": 512, 00:17:16.680 "num_blocks": 65536, 00:17:16.680 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:16.680 "assigned_rate_limits": { 00:17:16.680 "rw_ios_per_sec": 0, 00:17:16.680 "rw_mbytes_per_sec": 0, 00:17:16.680 "r_mbytes_per_sec": 0, 00:17:16.680 "w_mbytes_per_sec": 0 00:17:16.680 }, 00:17:16.680 "claimed": true, 00:17:16.680 "claim_type": "exclusive_write", 00:17:16.680 "zoned": false, 00:17:16.680 "supported_io_types": { 00:17:16.680 "read": true, 00:17:16.680 "write": true, 00:17:16.680 "unmap": true, 00:17:16.680 "flush": true, 00:17:16.680 "reset": true, 00:17:16.680 "nvme_admin": false, 00:17:16.680 "nvme_io": false, 00:17:16.680 "nvme_io_md": false, 00:17:16.680 "write_zeroes": true, 00:17:16.680 "zcopy": true, 00:17:16.680 "get_zone_info": false, 00:17:16.680 "zone_management": false, 00:17:16.680 "zone_append": false, 00:17:16.680 "compare": false, 00:17:16.680 
"compare_and_write": false, 00:17:16.680 "abort": true, 00:17:16.680 "seek_hole": false, 00:17:16.680 "seek_data": false, 00:17:16.680 "copy": true, 00:17:16.680 "nvme_iov_md": false 00:17:16.680 }, 00:17:16.680 "memory_domains": [ 00:17:16.680 { 00:17:16.680 "dma_device_id": "system", 00:17:16.680 "dma_device_type": 1 00:17:16.680 }, 00:17:16.680 { 00:17:16.680 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:16.680 "dma_device_type": 2 00:17:16.680 } 00:17:16.680 ], 00:17:16.680 "driver_specific": {} 00:17:16.680 } 00:17:16.680 ] 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.680 15:53:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:16.680 15:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.680 "name": "Existed_Raid", 00:17:16.680 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:16.680 "strip_size_kb": 0, 00:17:16.680 "state": "configuring", 00:17:16.680 "raid_level": "raid1", 00:17:16.680 "superblock": true, 00:17:16.680 "num_base_bdevs": 3, 00:17:16.680 "num_base_bdevs_discovered": 2, 00:17:16.680 "num_base_bdevs_operational": 3, 00:17:16.680 "base_bdevs_list": [ 00:17:16.680 { 00:17:16.680 "name": "BaseBdev1", 00:17:16.680 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:16.680 "is_configured": true, 00:17:16.680 "data_offset": 2048, 00:17:16.680 "data_size": 63488 00:17:16.680 }, 00:17:16.680 { 00:17:16.680 "name": null, 00:17:16.680 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:16.680 "is_configured": false, 00:17:16.680 "data_offset": 2048, 00:17:16.680 "data_size": 63488 00:17:16.680 }, 00:17:16.680 { 00:17:16.680 "name": "BaseBdev3", 00:17:16.680 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:16.680 "is_configured": true, 00:17:16.680 "data_offset": 2048, 00:17:16.680 "data_size": 63488 00:17:16.680 } 00:17:16.680 ] 00:17:16.680 }' 00:17:16.680 15:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.680 15:53:37 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:17:17.249 15:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:17.250 15:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.509 15:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:17.509 15:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:17.769 [2024-07-12 15:53:38.024343] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.769 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:18.029 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.029 "name": "Existed_Raid", 00:17:18.029 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:18.029 "strip_size_kb": 0, 00:17:18.029 "state": "configuring", 00:17:18.029 "raid_level": "raid1", 00:17:18.029 "superblock": true, 00:17:18.029 "num_base_bdevs": 3, 00:17:18.029 "num_base_bdevs_discovered": 1, 00:17:18.029 "num_base_bdevs_operational": 3, 00:17:18.029 "base_bdevs_list": [ 00:17:18.029 { 00:17:18.029 "name": "BaseBdev1", 00:17:18.029 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:18.029 "is_configured": true, 00:17:18.029 "data_offset": 2048, 00:17:18.029 "data_size": 63488 00:17:18.029 }, 00:17:18.029 { 00:17:18.029 "name": null, 00:17:18.029 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:18.029 "is_configured": false, 00:17:18.029 "data_offset": 2048, 00:17:18.029 "data_size": 63488 00:17:18.029 }, 00:17:18.029 { 00:17:18.029 "name": null, 00:17:18.029 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:18.029 "is_configured": false, 00:17:18.029 "data_offset": 2048, 00:17:18.029 "data_size": 63488 00:17:18.029 } 00:17:18.029 ] 00:17:18.029 }' 
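The assertions in this run all follow the same read-back pattern: after each bdev_raid_remove_base_bdev / bdev_raid_add_base_bdev / bdev_malloc_delete step, the raid bdev is fetched over the RPC socket and its state, num_base_bdevs_discovered and num_base_bdevs_operational are compared against the expected values. Below is a minimal stand-alone sketch of that read-back, using only the rpc.py path and socket that appear in this log; the final jq projection is an illustrative choice, not the harness's exact filter.

    # Sketch only: reproduce the state read-back by hand against the same RPC socket.
    # rpc.py path and socket are taken from the log above; the projection is assumed.
    rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py"
    sock="/var/tmp/spdk-raid.sock"
    # Fetch every raid bdev and keep only Existed_Raid, as the jq filter in the log does.
    "$rpc_py" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
    # Project just the fields the state assertions compare (illustrative projection).
    "$rpc_py" -s "$sock" bdev_raid_get_bdevs all \
      | jq -r '.[] | select(.name == "Existed_Raid") | [.state, .num_base_bdevs_discovered, .num_base_bdevs_operational] | @tsv'

At this point in the run the projection would read "configuring 1 3", matching the raid_bdev_info block just above.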
00:17:18.029 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.029 15:53:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:18.598 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.598 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:18.598 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:18.598 15:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:18.858 [2024-07-12 15:53:39.135178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.858 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:19.118 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.118 "name": "Existed_Raid", 00:17:19.118 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:19.118 "strip_size_kb": 0, 00:17:19.118 "state": "configuring", 00:17:19.118 "raid_level": "raid1", 00:17:19.118 "superblock": true, 00:17:19.118 "num_base_bdevs": 3, 00:17:19.118 "num_base_bdevs_discovered": 2, 00:17:19.118 "num_base_bdevs_operational": 3, 00:17:19.118 "base_bdevs_list": [ 00:17:19.118 { 00:17:19.118 "name": "BaseBdev1", 00:17:19.118 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:19.118 "is_configured": true, 00:17:19.118 "data_offset": 2048, 00:17:19.118 "data_size": 63488 00:17:19.118 }, 00:17:19.118 { 00:17:19.118 "name": null, 00:17:19.118 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:19.118 "is_configured": false, 00:17:19.118 "data_offset": 2048, 00:17:19.118 "data_size": 63488 00:17:19.118 }, 00:17:19.118 { 00:17:19.118 "name": "BaseBdev3", 
00:17:19.118 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:19.118 "is_configured": true, 00:17:19.118 "data_offset": 2048, 00:17:19.118 "data_size": 63488 00:17:19.118 } 00:17:19.118 ] 00:17:19.118 }' 00:17:19.118 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.118 15:53:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:19.687 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.687 15:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:19.687 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:19.687 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:19.947 [2024-07-12 15:53:40.254019] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:19.947 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:19.947 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:19.947 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:19.947 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:19.947 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:19.947 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:19.947 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.947 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.947 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.948 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.948 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.948 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.207 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.207 "name": "Existed_Raid", 00:17:20.207 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:20.207 "strip_size_kb": 0, 00:17:20.207 "state": "configuring", 00:17:20.207 "raid_level": "raid1", 00:17:20.207 "superblock": true, 00:17:20.207 "num_base_bdevs": 3, 00:17:20.207 "num_base_bdevs_discovered": 1, 00:17:20.207 "num_base_bdevs_operational": 3, 00:17:20.207 "base_bdevs_list": [ 00:17:20.207 { 00:17:20.207 "name": null, 00:17:20.207 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:20.207 "is_configured": false, 00:17:20.207 "data_offset": 2048, 00:17:20.207 "data_size": 63488 00:17:20.207 }, 00:17:20.207 { 00:17:20.207 "name": null, 00:17:20.207 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:20.207 
"is_configured": false, 00:17:20.207 "data_offset": 2048, 00:17:20.207 "data_size": 63488 00:17:20.207 }, 00:17:20.207 { 00:17:20.207 "name": "BaseBdev3", 00:17:20.207 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:20.207 "is_configured": true, 00:17:20.207 "data_offset": 2048, 00:17:20.207 "data_size": 63488 00:17:20.207 } 00:17:20.207 ] 00:17:20.207 }' 00:17:20.207 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.207 15:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:20.776 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.776 15:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:20.776 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:20.776 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:21.035 [2024-07-12 15:53:41.358625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.035 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:21.295 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.295 "name": "Existed_Raid", 00:17:21.295 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:21.295 "strip_size_kb": 0, 00:17:21.295 "state": "configuring", 00:17:21.295 "raid_level": "raid1", 00:17:21.295 "superblock": true, 00:17:21.295 "num_base_bdevs": 3, 00:17:21.295 "num_base_bdevs_discovered": 2, 00:17:21.295 "num_base_bdevs_operational": 3, 00:17:21.295 "base_bdevs_list": [ 00:17:21.295 { 00:17:21.295 "name": null, 00:17:21.295 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:21.295 "is_configured": false, 
00:17:21.295 "data_offset": 2048, 00:17:21.295 "data_size": 63488 00:17:21.295 }, 00:17:21.295 { 00:17:21.295 "name": "BaseBdev2", 00:17:21.295 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:21.295 "is_configured": true, 00:17:21.295 "data_offset": 2048, 00:17:21.296 "data_size": 63488 00:17:21.296 }, 00:17:21.296 { 00:17:21.296 "name": "BaseBdev3", 00:17:21.296 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:21.296 "is_configured": true, 00:17:21.296 "data_offset": 2048, 00:17:21.296 "data_size": 63488 00:17:21.296 } 00:17:21.296 ] 00:17:21.296 }' 00:17:21.296 15:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.296 15:53:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:21.864 15:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.864 15:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:22.123 15:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:22.123 15:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.123 15:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:22.123 15:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7da162a6-5933-4b43-b8bd-149bdfdcb774 00:17:22.382 [2024-07-12 15:53:42.690999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:22.382 [2024-07-12 15:53:42.691106] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x201e8b0 00:17:22.382 [2024-07-12 15:53:42.691113] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:22.382 [2024-07-12 15:53:42.691254] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21ce370 00:17:22.382 [2024-07-12 15:53:42.691343] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x201e8b0 00:17:22.382 [2024-07-12 15:53:42.691348] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x201e8b0 00:17:22.382 [2024-07-12 15:53:42.691422] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:22.382 NewBaseBdev 00:17:22.382 15:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:22.382 15:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:22.382 15:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:22.382 15:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:22.382 15:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:22.382 15:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:22.382 15:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:22.641 15:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:22.641 [ 00:17:22.641 { 00:17:22.641 "name": "NewBaseBdev", 00:17:22.641 "aliases": [ 00:17:22.641 "7da162a6-5933-4b43-b8bd-149bdfdcb774" 00:17:22.641 ], 00:17:22.641 "product_name": "Malloc disk", 00:17:22.641 "block_size": 512, 00:17:22.641 "num_blocks": 65536, 00:17:22.641 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:22.641 "assigned_rate_limits": { 00:17:22.641 "rw_ios_per_sec": 0, 00:17:22.641 "rw_mbytes_per_sec": 0, 00:17:22.641 "r_mbytes_per_sec": 0, 00:17:22.641 "w_mbytes_per_sec": 0 00:17:22.641 }, 00:17:22.641 "claimed": true, 00:17:22.641 "claim_type": "exclusive_write", 00:17:22.641 "zoned": false, 00:17:22.641 "supported_io_types": { 00:17:22.641 "read": true, 00:17:22.641 "write": true, 00:17:22.641 "unmap": true, 00:17:22.641 "flush": true, 00:17:22.641 "reset": true, 00:17:22.641 "nvme_admin": false, 00:17:22.641 "nvme_io": false, 00:17:22.641 "nvme_io_md": false, 00:17:22.641 "write_zeroes": true, 00:17:22.641 "zcopy": true, 00:17:22.641 "get_zone_info": false, 00:17:22.641 "zone_management": false, 00:17:22.641 "zone_append": false, 00:17:22.641 "compare": false, 00:17:22.641 "compare_and_write": false, 00:17:22.641 "abort": true, 00:17:22.641 "seek_hole": false, 00:17:22.641 "seek_data": false, 00:17:22.641 "copy": true, 00:17:22.641 "nvme_iov_md": false 00:17:22.641 }, 00:17:22.641 "memory_domains": [ 00:17:22.641 { 00:17:22.641 "dma_device_id": "system", 00:17:22.641 "dma_device_type": 1 00:17:22.641 }, 00:17:22.641 { 00:17:22.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.641 "dma_device_type": 2 00:17:22.641 } 00:17:22.641 ], 00:17:22.641 "driver_specific": {} 00:17:22.641 } 00:17:22.641 ] 00:17:22.641 15:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:22.641 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:22.641 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:22.641 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:22.641 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:22.642 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:22.642 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:22.642 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.642 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.642 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.642 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.642 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.642 15:53:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.901 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.901 "name": "Existed_Raid", 00:17:22.901 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:22.901 "strip_size_kb": 0, 00:17:22.901 "state": "online", 00:17:22.901 "raid_level": "raid1", 00:17:22.901 "superblock": true, 00:17:22.901 "num_base_bdevs": 3, 00:17:22.901 "num_base_bdevs_discovered": 3, 00:17:22.901 "num_base_bdevs_operational": 3, 00:17:22.901 "base_bdevs_list": [ 00:17:22.901 { 00:17:22.901 "name": "NewBaseBdev", 00:17:22.901 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:22.901 "is_configured": true, 00:17:22.901 "data_offset": 2048, 00:17:22.901 "data_size": 63488 00:17:22.901 }, 00:17:22.901 { 00:17:22.901 "name": "BaseBdev2", 00:17:22.901 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:22.901 "is_configured": true, 00:17:22.901 "data_offset": 2048, 00:17:22.901 "data_size": 63488 00:17:22.901 }, 00:17:22.901 { 00:17:22.901 "name": "BaseBdev3", 00:17:22.901 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:22.901 "is_configured": true, 00:17:22.901 "data_offset": 2048, 00:17:22.901 "data_size": 63488 00:17:22.901 } 00:17:22.901 ] 00:17:22.901 }' 00:17:22.901 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.901 15:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:23.468 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:23.468 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:23.468 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:23.468 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:23.468 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:23.468 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:23.468 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:23.468 15:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:23.728 [2024-07-12 15:53:44.006559] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:23.728 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:23.728 "name": "Existed_Raid", 00:17:23.728 "aliases": [ 00:17:23.728 "710c7da3-3cc7-4261-9e37-6dd06057aff2" 00:17:23.728 ], 00:17:23.728 "product_name": "Raid Volume", 00:17:23.728 "block_size": 512, 00:17:23.728 "num_blocks": 63488, 00:17:23.728 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:23.728 "assigned_rate_limits": { 00:17:23.728 "rw_ios_per_sec": 0, 00:17:23.728 "rw_mbytes_per_sec": 0, 00:17:23.728 "r_mbytes_per_sec": 0, 00:17:23.728 "w_mbytes_per_sec": 0 00:17:23.728 }, 00:17:23.728 "claimed": false, 00:17:23.728 "zoned": false, 00:17:23.728 "supported_io_types": { 00:17:23.728 "read": true, 00:17:23.728 "write": true, 00:17:23.728 "unmap": false, 00:17:23.728 "flush": false, 00:17:23.728 "reset": true, 00:17:23.728 "nvme_admin": false, 00:17:23.728 "nvme_io": false, 00:17:23.728 "nvme_io_md": 
false, 00:17:23.728 "write_zeroes": true, 00:17:23.728 "zcopy": false, 00:17:23.728 "get_zone_info": false, 00:17:23.728 "zone_management": false, 00:17:23.728 "zone_append": false, 00:17:23.728 "compare": false, 00:17:23.728 "compare_and_write": false, 00:17:23.728 "abort": false, 00:17:23.728 "seek_hole": false, 00:17:23.728 "seek_data": false, 00:17:23.728 "copy": false, 00:17:23.728 "nvme_iov_md": false 00:17:23.728 }, 00:17:23.728 "memory_domains": [ 00:17:23.728 { 00:17:23.728 "dma_device_id": "system", 00:17:23.728 "dma_device_type": 1 00:17:23.728 }, 00:17:23.728 { 00:17:23.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.728 "dma_device_type": 2 00:17:23.728 }, 00:17:23.728 { 00:17:23.728 "dma_device_id": "system", 00:17:23.728 "dma_device_type": 1 00:17:23.728 }, 00:17:23.728 { 00:17:23.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.728 "dma_device_type": 2 00:17:23.728 }, 00:17:23.728 { 00:17:23.728 "dma_device_id": "system", 00:17:23.728 "dma_device_type": 1 00:17:23.728 }, 00:17:23.728 { 00:17:23.728 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.728 "dma_device_type": 2 00:17:23.728 } 00:17:23.728 ], 00:17:23.728 "driver_specific": { 00:17:23.728 "raid": { 00:17:23.728 "uuid": "710c7da3-3cc7-4261-9e37-6dd06057aff2", 00:17:23.728 "strip_size_kb": 0, 00:17:23.728 "state": "online", 00:17:23.728 "raid_level": "raid1", 00:17:23.728 "superblock": true, 00:17:23.728 "num_base_bdevs": 3, 00:17:23.728 "num_base_bdevs_discovered": 3, 00:17:23.728 "num_base_bdevs_operational": 3, 00:17:23.728 "base_bdevs_list": [ 00:17:23.728 { 00:17:23.728 "name": "NewBaseBdev", 00:17:23.728 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:23.728 "is_configured": true, 00:17:23.728 "data_offset": 2048, 00:17:23.728 "data_size": 63488 00:17:23.728 }, 00:17:23.728 { 00:17:23.728 "name": "BaseBdev2", 00:17:23.728 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:23.728 "is_configured": true, 00:17:23.728 "data_offset": 2048, 00:17:23.728 "data_size": 63488 00:17:23.728 }, 00:17:23.728 { 00:17:23.728 "name": "BaseBdev3", 00:17:23.728 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:23.728 "is_configured": true, 00:17:23.728 "data_offset": 2048, 00:17:23.728 "data_size": 63488 00:17:23.728 } 00:17:23.728 ] 00:17:23.728 } 00:17:23.728 } 00:17:23.728 }' 00:17:23.728 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:23.728 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:23.728 BaseBdev2 00:17:23.728 BaseBdev3' 00:17:23.728 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:23.728 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:23.728 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:23.988 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:23.988 "name": "NewBaseBdev", 00:17:23.988 "aliases": [ 00:17:23.988 "7da162a6-5933-4b43-b8bd-149bdfdcb774" 00:17:23.988 ], 00:17:23.988 "product_name": "Malloc disk", 00:17:23.988 "block_size": 512, 00:17:23.988 "num_blocks": 65536, 00:17:23.988 "uuid": "7da162a6-5933-4b43-b8bd-149bdfdcb774", 00:17:23.988 "assigned_rate_limits": { 00:17:23.988 
"rw_ios_per_sec": 0, 00:17:23.988 "rw_mbytes_per_sec": 0, 00:17:23.988 "r_mbytes_per_sec": 0, 00:17:23.988 "w_mbytes_per_sec": 0 00:17:23.988 }, 00:17:23.988 "claimed": true, 00:17:23.988 "claim_type": "exclusive_write", 00:17:23.988 "zoned": false, 00:17:23.988 "supported_io_types": { 00:17:23.988 "read": true, 00:17:23.988 "write": true, 00:17:23.988 "unmap": true, 00:17:23.988 "flush": true, 00:17:23.988 "reset": true, 00:17:23.988 "nvme_admin": false, 00:17:23.988 "nvme_io": false, 00:17:23.988 "nvme_io_md": false, 00:17:23.988 "write_zeroes": true, 00:17:23.988 "zcopy": true, 00:17:23.988 "get_zone_info": false, 00:17:23.988 "zone_management": false, 00:17:23.988 "zone_append": false, 00:17:23.988 "compare": false, 00:17:23.988 "compare_and_write": false, 00:17:23.988 "abort": true, 00:17:23.988 "seek_hole": false, 00:17:23.988 "seek_data": false, 00:17:23.988 "copy": true, 00:17:23.988 "nvme_iov_md": false 00:17:23.988 }, 00:17:23.988 "memory_domains": [ 00:17:23.988 { 00:17:23.988 "dma_device_id": "system", 00:17:23.988 "dma_device_type": 1 00:17:23.988 }, 00:17:23.988 { 00:17:23.988 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.988 "dma_device_type": 2 00:17:23.988 } 00:17:23.988 ], 00:17:23.988 "driver_specific": {} 00:17:23.988 }' 00:17:23.988 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:23.988 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:23.988 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:23.988 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:23.988 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:24.247 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:24.506 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:24.506 "name": "BaseBdev2", 00:17:24.506 "aliases": [ 00:17:24.506 "56275e7f-a481-402f-b632-c01cd45a51a0" 00:17:24.506 ], 00:17:24.506 "product_name": "Malloc disk", 00:17:24.506 "block_size": 512, 00:17:24.506 "num_blocks": 65536, 00:17:24.506 "uuid": "56275e7f-a481-402f-b632-c01cd45a51a0", 00:17:24.506 "assigned_rate_limits": { 00:17:24.506 "rw_ios_per_sec": 0, 00:17:24.506 "rw_mbytes_per_sec": 0, 00:17:24.506 "r_mbytes_per_sec": 0, 00:17:24.506 "w_mbytes_per_sec": 0 
00:17:24.507 }, 00:17:24.507 "claimed": true, 00:17:24.507 "claim_type": "exclusive_write", 00:17:24.507 "zoned": false, 00:17:24.507 "supported_io_types": { 00:17:24.507 "read": true, 00:17:24.507 "write": true, 00:17:24.507 "unmap": true, 00:17:24.507 "flush": true, 00:17:24.507 "reset": true, 00:17:24.507 "nvme_admin": false, 00:17:24.507 "nvme_io": false, 00:17:24.507 "nvme_io_md": false, 00:17:24.507 "write_zeroes": true, 00:17:24.507 "zcopy": true, 00:17:24.507 "get_zone_info": false, 00:17:24.507 "zone_management": false, 00:17:24.507 "zone_append": false, 00:17:24.507 "compare": false, 00:17:24.507 "compare_and_write": false, 00:17:24.507 "abort": true, 00:17:24.507 "seek_hole": false, 00:17:24.507 "seek_data": false, 00:17:24.507 "copy": true, 00:17:24.507 "nvme_iov_md": false 00:17:24.507 }, 00:17:24.507 "memory_domains": [ 00:17:24.507 { 00:17:24.507 "dma_device_id": "system", 00:17:24.507 "dma_device_type": 1 00:17:24.507 }, 00:17:24.507 { 00:17:24.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.507 "dma_device_type": 2 00:17:24.507 } 00:17:24.507 ], 00:17:24.507 "driver_specific": {} 00:17:24.507 }' 00:17:24.507 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.507 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.507 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:24.507 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.507 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.767 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:24.767 15:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.767 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.767 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:24.767 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.767 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.767 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:24.767 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:24.767 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:24.767 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:25.027 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:25.027 "name": "BaseBdev3", 00:17:25.027 "aliases": [ 00:17:25.027 "6192dec9-900c-4912-9d44-419cd964bc3b" 00:17:25.027 ], 00:17:25.027 "product_name": "Malloc disk", 00:17:25.027 "block_size": 512, 00:17:25.027 "num_blocks": 65536, 00:17:25.027 "uuid": "6192dec9-900c-4912-9d44-419cd964bc3b", 00:17:25.027 "assigned_rate_limits": { 00:17:25.027 "rw_ios_per_sec": 0, 00:17:25.027 "rw_mbytes_per_sec": 0, 00:17:25.027 "r_mbytes_per_sec": 0, 00:17:25.027 "w_mbytes_per_sec": 0 00:17:25.027 }, 00:17:25.027 "claimed": true, 00:17:25.027 "claim_type": "exclusive_write", 00:17:25.027 "zoned": false, 00:17:25.027 
"supported_io_types": { 00:17:25.027 "read": true, 00:17:25.027 "write": true, 00:17:25.027 "unmap": true, 00:17:25.027 "flush": true, 00:17:25.027 "reset": true, 00:17:25.027 "nvme_admin": false, 00:17:25.027 "nvme_io": false, 00:17:25.027 "nvme_io_md": false, 00:17:25.027 "write_zeroes": true, 00:17:25.027 "zcopy": true, 00:17:25.027 "get_zone_info": false, 00:17:25.027 "zone_management": false, 00:17:25.027 "zone_append": false, 00:17:25.027 "compare": false, 00:17:25.027 "compare_and_write": false, 00:17:25.027 "abort": true, 00:17:25.027 "seek_hole": false, 00:17:25.027 "seek_data": false, 00:17:25.027 "copy": true, 00:17:25.027 "nvme_iov_md": false 00:17:25.027 }, 00:17:25.027 "memory_domains": [ 00:17:25.027 { 00:17:25.027 "dma_device_id": "system", 00:17:25.027 "dma_device_type": 1 00:17:25.027 }, 00:17:25.027 { 00:17:25.027 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.027 "dma_device_type": 2 00:17:25.027 } 00:17:25.027 ], 00:17:25.027 "driver_specific": {} 00:17:25.027 }' 00:17:25.027 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.027 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.027 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:25.027 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.286 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.286 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:25.286 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.286 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.286 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:25.286 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.286 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.546 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:25.546 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:25.546 [2024-07-12 15:53:45.915263] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:25.546 [2024-07-12 15:53:45.915294] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:25.546 [2024-07-12 15:53:45.915344] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:25.546 [2024-07-12 15:53:45.915570] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:25.546 [2024-07-12 15:53:45.915579] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x201e8b0 name Existed_Raid, state offline 00:17:25.546 15:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2556777 00:17:25.546 15:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2556777 ']' 00:17:25.546 15:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2556777 00:17:25.546 15:53:45 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:25.546 15:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:25.546 15:53:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2556777 00:17:25.806 15:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:25.806 15:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:25.806 15:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2556777' 00:17:25.806 killing process with pid 2556777 00:17:25.806 15:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2556777 00:17:25.806 [2024-07-12 15:53:46.009165] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:25.806 15:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2556777 00:17:25.806 [2024-07-12 15:53:46.033806] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:25.806 15:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:25.806 00:17:25.806 real 0m25.380s 00:17:25.806 user 0m47.555s 00:17:25.806 sys 0m3.696s 00:17:25.806 15:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:25.806 15:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:25.806 ************************************ 00:17:25.806 END TEST raid_state_function_test_sb 00:17:25.806 ************************************ 00:17:25.806 15:53:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:25.806 15:53:46 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:17:25.806 15:53:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:25.807 15:53:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:25.807 15:53:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:26.065 ************************************ 00:17:26.065 START TEST raid_superblock_test 00:17:26.065 ************************************ 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 3 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:26.065 15:53:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:26.065 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2561663 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2561663 /var/tmp/spdk-raid.sock 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2561663 ']' 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:26.066 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:26.066 15:53:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:26.066 [2024-07-12 15:53:46.324287] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:17:26.066 [2024-07-12 15:53:46.324333] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2561663 ] 00:17:26.066 [2024-07-12 15:53:46.411423] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:26.066 [2024-07-12 15:53:46.475064] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:26.325 [2024-07-12 15:53:46.516930] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:26.325 [2024-07-12 15:53:46.516953] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:26.894 malloc1 00:17:26.894 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:27.153 [2024-07-12 15:53:47.495046] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:27.153 [2024-07-12 15:53:47.495077] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:27.153 [2024-07-12 15:53:47.495088] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11e6b50 00:17:27.153 [2024-07-12 15:53:47.495094] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:27.153 [2024-07-12 15:53:47.496351] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:27.153 [2024-07-12 15:53:47.496372] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:27.153 pt1 00:17:27.153 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:27.153 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:27.153 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:27.153 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:27.153 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:27.153 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:27.153 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:27.153 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:27.153 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:27.413 malloc2 00:17:27.413 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:27.413 [2024-07-12 15:53:47.849795] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:27.413 [2024-07-12 15:53:47.849822] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:27.413 [2024-07-12 15:53:47.849832] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11e7df0 00:17:27.413 [2024-07-12 15:53:47.849838] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:27.413 [2024-07-12 15:53:47.850992] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:27.413 [2024-07-12 15:53:47.851011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:27.413 pt2 00:17:27.672 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:27.672 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:27.672 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:27.672 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:27.672 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:27.672 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:27.672 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:27.672 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:27.672 15:53:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:27.672 malloc3 00:17:27.672 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:27.932 [2024-07-12 15:53:48.232635] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:27.932 [2024-07-12 15:53:48.232664] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:27.932 [2024-07-12 15:53:48.232674] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11e7770 00:17:27.932 [2024-07-12 15:53:48.232680] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:27.932 [2024-07-12 15:53:48.233857] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:27.932 [2024-07-12 15:53:48.233875] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:27.932 pt3 00:17:27.932 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:27.932 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:27.932 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:28.191 [2024-07-12 15:53:48.421125] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:28.191 [2024-07-12 15:53:48.422124] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:28.191 [2024-07-12 15:53:48.422165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:28.191 [2024-07-12 15:53:48.422290] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x138ccb0 00:17:28.191 [2024-07-12 15:53:48.422298] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:28.191 [2024-07-12 15:53:48.422442] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e7600 00:17:28.191 [2024-07-12 15:53:48.422551] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x138ccb0 00:17:28.191 [2024-07-12 15:53:48.422557] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x138ccb0 00:17:28.191 [2024-07-12 15:53:48.422625] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.191 "name": "raid_bdev1", 00:17:28.191 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:28.191 "strip_size_kb": 0, 00:17:28.191 "state": "online", 00:17:28.191 "raid_level": "raid1", 00:17:28.191 "superblock": true, 00:17:28.191 "num_base_bdevs": 3, 00:17:28.191 
"num_base_bdevs_discovered": 3, 00:17:28.191 "num_base_bdevs_operational": 3, 00:17:28.191 "base_bdevs_list": [ 00:17:28.191 { 00:17:28.191 "name": "pt1", 00:17:28.191 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:28.191 "is_configured": true, 00:17:28.191 "data_offset": 2048, 00:17:28.191 "data_size": 63488 00:17:28.191 }, 00:17:28.191 { 00:17:28.191 "name": "pt2", 00:17:28.191 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:28.191 "is_configured": true, 00:17:28.191 "data_offset": 2048, 00:17:28.191 "data_size": 63488 00:17:28.191 }, 00:17:28.191 { 00:17:28.191 "name": "pt3", 00:17:28.191 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:28.191 "is_configured": true, 00:17:28.191 "data_offset": 2048, 00:17:28.191 "data_size": 63488 00:17:28.191 } 00:17:28.191 ] 00:17:28.191 }' 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.191 15:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.759 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:28.759 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:28.759 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:28.759 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:28.759 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:28.759 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:28.759 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:28.759 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:29.018 [2024-07-12 15:53:49.323615] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:29.018 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:29.018 "name": "raid_bdev1", 00:17:29.018 "aliases": [ 00:17:29.018 "2273a614-62ae-48cc-bbaa-7b1d643bed10" 00:17:29.018 ], 00:17:29.018 "product_name": "Raid Volume", 00:17:29.018 "block_size": 512, 00:17:29.018 "num_blocks": 63488, 00:17:29.018 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:29.018 "assigned_rate_limits": { 00:17:29.018 "rw_ios_per_sec": 0, 00:17:29.018 "rw_mbytes_per_sec": 0, 00:17:29.019 "r_mbytes_per_sec": 0, 00:17:29.019 "w_mbytes_per_sec": 0 00:17:29.019 }, 00:17:29.019 "claimed": false, 00:17:29.019 "zoned": false, 00:17:29.019 "supported_io_types": { 00:17:29.019 "read": true, 00:17:29.019 "write": true, 00:17:29.019 "unmap": false, 00:17:29.019 "flush": false, 00:17:29.019 "reset": true, 00:17:29.019 "nvme_admin": false, 00:17:29.019 "nvme_io": false, 00:17:29.019 "nvme_io_md": false, 00:17:29.019 "write_zeroes": true, 00:17:29.019 "zcopy": false, 00:17:29.019 "get_zone_info": false, 00:17:29.019 "zone_management": false, 00:17:29.019 "zone_append": false, 00:17:29.019 "compare": false, 00:17:29.019 "compare_and_write": false, 00:17:29.019 "abort": false, 00:17:29.019 "seek_hole": false, 00:17:29.019 "seek_data": false, 00:17:29.019 "copy": false, 00:17:29.019 "nvme_iov_md": false 00:17:29.019 }, 00:17:29.019 "memory_domains": [ 00:17:29.019 { 00:17:29.019 "dma_device_id": "system", 00:17:29.019 "dma_device_type": 1 00:17:29.019 }, 
00:17:29.019 { 00:17:29.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.019 "dma_device_type": 2 00:17:29.019 }, 00:17:29.019 { 00:17:29.019 "dma_device_id": "system", 00:17:29.019 "dma_device_type": 1 00:17:29.019 }, 00:17:29.019 { 00:17:29.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.019 "dma_device_type": 2 00:17:29.019 }, 00:17:29.019 { 00:17:29.019 "dma_device_id": "system", 00:17:29.019 "dma_device_type": 1 00:17:29.019 }, 00:17:29.019 { 00:17:29.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.019 "dma_device_type": 2 00:17:29.019 } 00:17:29.019 ], 00:17:29.019 "driver_specific": { 00:17:29.019 "raid": { 00:17:29.019 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:29.019 "strip_size_kb": 0, 00:17:29.019 "state": "online", 00:17:29.019 "raid_level": "raid1", 00:17:29.019 "superblock": true, 00:17:29.019 "num_base_bdevs": 3, 00:17:29.019 "num_base_bdevs_discovered": 3, 00:17:29.019 "num_base_bdevs_operational": 3, 00:17:29.019 "base_bdevs_list": [ 00:17:29.019 { 00:17:29.019 "name": "pt1", 00:17:29.019 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:29.019 "is_configured": true, 00:17:29.019 "data_offset": 2048, 00:17:29.019 "data_size": 63488 00:17:29.019 }, 00:17:29.019 { 00:17:29.019 "name": "pt2", 00:17:29.019 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:29.019 "is_configured": true, 00:17:29.019 "data_offset": 2048, 00:17:29.019 "data_size": 63488 00:17:29.019 }, 00:17:29.019 { 00:17:29.019 "name": "pt3", 00:17:29.019 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:29.019 "is_configured": true, 00:17:29.019 "data_offset": 2048, 00:17:29.019 "data_size": 63488 00:17:29.019 } 00:17:29.019 ] 00:17:29.019 } 00:17:29.019 } 00:17:29.019 }' 00:17:29.019 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:29.019 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:29.019 pt2 00:17:29.019 pt3' 00:17:29.019 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:29.019 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:29.019 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:29.279 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:29.279 "name": "pt1", 00:17:29.279 "aliases": [ 00:17:29.279 "00000000-0000-0000-0000-000000000001" 00:17:29.279 ], 00:17:29.279 "product_name": "passthru", 00:17:29.279 "block_size": 512, 00:17:29.279 "num_blocks": 65536, 00:17:29.279 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:29.279 "assigned_rate_limits": { 00:17:29.279 "rw_ios_per_sec": 0, 00:17:29.279 "rw_mbytes_per_sec": 0, 00:17:29.279 "r_mbytes_per_sec": 0, 00:17:29.279 "w_mbytes_per_sec": 0 00:17:29.279 }, 00:17:29.279 "claimed": true, 00:17:29.279 "claim_type": "exclusive_write", 00:17:29.279 "zoned": false, 00:17:29.279 "supported_io_types": { 00:17:29.279 "read": true, 00:17:29.279 "write": true, 00:17:29.279 "unmap": true, 00:17:29.279 "flush": true, 00:17:29.279 "reset": true, 00:17:29.279 "nvme_admin": false, 00:17:29.279 "nvme_io": false, 00:17:29.279 "nvme_io_md": false, 00:17:29.279 "write_zeroes": true, 00:17:29.279 "zcopy": true, 00:17:29.279 "get_zone_info": false, 00:17:29.279 "zone_management": false, 00:17:29.279 
"zone_append": false, 00:17:29.279 "compare": false, 00:17:29.279 "compare_and_write": false, 00:17:29.279 "abort": true, 00:17:29.279 "seek_hole": false, 00:17:29.279 "seek_data": false, 00:17:29.279 "copy": true, 00:17:29.279 "nvme_iov_md": false 00:17:29.279 }, 00:17:29.279 "memory_domains": [ 00:17:29.279 { 00:17:29.279 "dma_device_id": "system", 00:17:29.279 "dma_device_type": 1 00:17:29.279 }, 00:17:29.279 { 00:17:29.279 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.279 "dma_device_type": 2 00:17:29.279 } 00:17:29.279 ], 00:17:29.279 "driver_specific": { 00:17:29.279 "passthru": { 00:17:29.279 "name": "pt1", 00:17:29.279 "base_bdev_name": "malloc1" 00:17:29.279 } 00:17:29.279 } 00:17:29.279 }' 00:17:29.279 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.279 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.279 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:29.279 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.279 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.538 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:29.538 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:29.538 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:29.538 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:29.538 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.538 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:29.538 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:29.539 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:29.539 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:29.539 15:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:29.798 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:29.798 "name": "pt2", 00:17:29.798 "aliases": [ 00:17:29.798 "00000000-0000-0000-0000-000000000002" 00:17:29.798 ], 00:17:29.798 "product_name": "passthru", 00:17:29.798 "block_size": 512, 00:17:29.798 "num_blocks": 65536, 00:17:29.798 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:29.798 "assigned_rate_limits": { 00:17:29.798 "rw_ios_per_sec": 0, 00:17:29.798 "rw_mbytes_per_sec": 0, 00:17:29.798 "r_mbytes_per_sec": 0, 00:17:29.798 "w_mbytes_per_sec": 0 00:17:29.798 }, 00:17:29.798 "claimed": true, 00:17:29.798 "claim_type": "exclusive_write", 00:17:29.798 "zoned": false, 00:17:29.798 "supported_io_types": { 00:17:29.798 "read": true, 00:17:29.798 "write": true, 00:17:29.798 "unmap": true, 00:17:29.798 "flush": true, 00:17:29.798 "reset": true, 00:17:29.798 "nvme_admin": false, 00:17:29.798 "nvme_io": false, 00:17:29.798 "nvme_io_md": false, 00:17:29.798 "write_zeroes": true, 00:17:29.798 "zcopy": true, 00:17:29.798 "get_zone_info": false, 00:17:29.798 "zone_management": false, 00:17:29.798 "zone_append": false, 00:17:29.798 "compare": false, 00:17:29.798 "compare_and_write": false, 00:17:29.798 "abort": true, 00:17:29.798 
"seek_hole": false, 00:17:29.798 "seek_data": false, 00:17:29.798 "copy": true, 00:17:29.798 "nvme_iov_md": false 00:17:29.798 }, 00:17:29.798 "memory_domains": [ 00:17:29.798 { 00:17:29.798 "dma_device_id": "system", 00:17:29.798 "dma_device_type": 1 00:17:29.798 }, 00:17:29.798 { 00:17:29.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.798 "dma_device_type": 2 00:17:29.798 } 00:17:29.798 ], 00:17:29.798 "driver_specific": { 00:17:29.798 "passthru": { 00:17:29.798 "name": "pt2", 00:17:29.798 "base_bdev_name": "malloc2" 00:17:29.798 } 00:17:29.798 } 00:17:29.798 }' 00:17:29.798 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.798 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:29.798 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:29.798 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:29.798 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:30.057 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:30.317 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:30.317 "name": "pt3", 00:17:30.317 "aliases": [ 00:17:30.317 "00000000-0000-0000-0000-000000000003" 00:17:30.317 ], 00:17:30.317 "product_name": "passthru", 00:17:30.317 "block_size": 512, 00:17:30.317 "num_blocks": 65536, 00:17:30.317 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:30.317 "assigned_rate_limits": { 00:17:30.317 "rw_ios_per_sec": 0, 00:17:30.317 "rw_mbytes_per_sec": 0, 00:17:30.317 "r_mbytes_per_sec": 0, 00:17:30.317 "w_mbytes_per_sec": 0 00:17:30.317 }, 00:17:30.317 "claimed": true, 00:17:30.317 "claim_type": "exclusive_write", 00:17:30.317 "zoned": false, 00:17:30.317 "supported_io_types": { 00:17:30.317 "read": true, 00:17:30.317 "write": true, 00:17:30.317 "unmap": true, 00:17:30.317 "flush": true, 00:17:30.317 "reset": true, 00:17:30.317 "nvme_admin": false, 00:17:30.317 "nvme_io": false, 00:17:30.317 "nvme_io_md": false, 00:17:30.317 "write_zeroes": true, 00:17:30.317 "zcopy": true, 00:17:30.317 "get_zone_info": false, 00:17:30.317 "zone_management": false, 00:17:30.317 "zone_append": false, 00:17:30.317 "compare": false, 00:17:30.317 "compare_and_write": false, 00:17:30.317 "abort": true, 00:17:30.317 "seek_hole": false, 00:17:30.317 "seek_data": false, 00:17:30.317 "copy": true, 00:17:30.317 "nvme_iov_md": false 00:17:30.317 }, 
00:17:30.317 "memory_domains": [ 00:17:30.317 { 00:17:30.317 "dma_device_id": "system", 00:17:30.317 "dma_device_type": 1 00:17:30.317 }, 00:17:30.317 { 00:17:30.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:30.317 "dma_device_type": 2 00:17:30.317 } 00:17:30.317 ], 00:17:30.317 "driver_specific": { 00:17:30.317 "passthru": { 00:17:30.317 "name": "pt3", 00:17:30.317 "base_bdev_name": "malloc3" 00:17:30.317 } 00:17:30.317 } 00:17:30.317 }' 00:17:30.317 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.317 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:30.317 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:30.317 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.576 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:30.576 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:30.576 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.576 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:30.576 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:30.576 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.576 15:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:30.836 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:30.836 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:30.836 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:30.836 [2024-07-12 15:53:51.204365] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:30.836 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2273a614-62ae-48cc-bbaa-7b1d643bed10 00:17:30.836 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 2273a614-62ae-48cc-bbaa-7b1d643bed10 ']' 00:17:30.836 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:31.095 [2024-07-12 15:53:51.400641] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:31.095 [2024-07-12 15:53:51.400653] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:31.095 [2024-07-12 15:53:51.400689] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:31.095 [2024-07-12 15:53:51.400744] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:31.095 [2024-07-12 15:53:51.400751] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x138ccb0 name raid_bdev1, state offline 00:17:31.095 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:31.096 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:31.356 15:53:51 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:31.356 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:31.356 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:31.356 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:31.356 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:31.356 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:31.662 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:31.662 15:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:31.921 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:31.921 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:32.180 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:17:32.180 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:32.180 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:32.180 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:32.181 
[2024-07-12 15:53:52.555521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:32.181 [2024-07-12 15:53:52.556572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:32.181 [2024-07-12 15:53:52.556604] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:32.181 [2024-07-12 15:53:52.556638] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:32.181 [2024-07-12 15:53:52.556665] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:32.181 [2024-07-12 15:53:52.556678] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:32.181 [2024-07-12 15:53:52.556688] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:32.181 [2024-07-12 15:53:52.556694] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1398360 name raid_bdev1, state configuring 00:17:32.181 request: 00:17:32.181 { 00:17:32.181 "name": "raid_bdev1", 00:17:32.181 "raid_level": "raid1", 00:17:32.181 "base_bdevs": [ 00:17:32.181 "malloc1", 00:17:32.181 "malloc2", 00:17:32.181 "malloc3" 00:17:32.181 ], 00:17:32.181 "superblock": false, 00:17:32.181 "method": "bdev_raid_create", 00:17:32.181 "req_id": 1 00:17:32.181 } 00:17:32.181 Got JSON-RPC error response 00:17:32.181 response: 00:17:32.181 { 00:17:32.181 "code": -17, 00:17:32.181 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:32.181 } 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.181 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:32.442 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:32.442 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:32.442 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:32.717 [2024-07-12 15:53:52.940450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:32.717 [2024-07-12 15:53:52.940472] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:32.717 [2024-07-12 15:53:52.940483] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x138e990 00:17:32.717 [2024-07-12 15:53:52.940488] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:32.717 [2024-07-12 15:53:52.941742] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:32.717 [2024-07-12 15:53:52.941762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:32.717 
[2024-07-12 15:53:52.941804] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:32.717 [2024-07-12 15:53:52.941822] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:32.717 pt1 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.717 15:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:32.717 15:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:32.717 "name": "raid_bdev1", 00:17:32.717 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:32.717 "strip_size_kb": 0, 00:17:32.717 "state": "configuring", 00:17:32.717 "raid_level": "raid1", 00:17:32.717 "superblock": true, 00:17:32.717 "num_base_bdevs": 3, 00:17:32.717 "num_base_bdevs_discovered": 1, 00:17:32.717 "num_base_bdevs_operational": 3, 00:17:32.717 "base_bdevs_list": [ 00:17:32.717 { 00:17:32.717 "name": "pt1", 00:17:32.717 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:32.717 "is_configured": true, 00:17:32.717 "data_offset": 2048, 00:17:32.717 "data_size": 63488 00:17:32.717 }, 00:17:32.717 { 00:17:32.717 "name": null, 00:17:32.717 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:32.717 "is_configured": false, 00:17:32.717 "data_offset": 2048, 00:17:32.717 "data_size": 63488 00:17:32.717 }, 00:17:32.717 { 00:17:32.717 "name": null, 00:17:32.717 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:32.717 "is_configured": false, 00:17:32.717 "data_offset": 2048, 00:17:32.717 "data_size": 63488 00:17:32.717 } 00:17:32.717 ] 00:17:32.717 }' 00:17:32.717 15:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:32.717 15:53:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:33.288 15:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:33.288 15:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:33.548 [2024-07-12 15:53:53.891000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:33.548 
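The verify_raid_bdev_state helper traced above reduces to a single RPC plus a jq filter. A hand-run equivalent, using only commands that appear in this trace, is sketched below; the expected values (state "configuring" with 1 of 3 base bdevs discovered) are taken from the verify output shown above, not from any other source.

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Fetch raid_bdev1 and assert the state reported while only pt1 is registered.
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
echo "$info" | jq -e '.state == "configuring" and .num_base_bdevs_discovered == 1' > /dev/null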
[2024-07-12 15:53:53.891030] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:33.548 [2024-07-12 15:53:53.891040] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x138b140 00:17:33.548 [2024-07-12 15:53:53.891046] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:33.548 [2024-07-12 15:53:53.891294] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:33.548 [2024-07-12 15:53:53.891306] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:33.548 [2024-07-12 15:53:53.891346] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:33.548 [2024-07-12 15:53:53.891357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:33.548 pt2 00:17:33.548 15:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:33.808 [2024-07-12 15:53:54.075474] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.808 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:34.068 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:34.068 "name": "raid_bdev1", 00:17:34.068 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:34.068 "strip_size_kb": 0, 00:17:34.068 "state": "configuring", 00:17:34.068 "raid_level": "raid1", 00:17:34.068 "superblock": true, 00:17:34.068 "num_base_bdevs": 3, 00:17:34.068 "num_base_bdevs_discovered": 1, 00:17:34.068 "num_base_bdevs_operational": 3, 00:17:34.068 "base_bdevs_list": [ 00:17:34.068 { 00:17:34.068 "name": "pt1", 00:17:34.068 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:34.068 "is_configured": true, 00:17:34.068 "data_offset": 2048, 00:17:34.068 "data_size": 63488 00:17:34.068 }, 00:17:34.068 { 00:17:34.068 "name": null, 00:17:34.068 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:34.068 "is_configured": false, 00:17:34.068 "data_offset": 2048, 00:17:34.068 "data_size": 63488 00:17:34.068 }, 00:17:34.068 { 00:17:34.068 
"name": null, 00:17:34.068 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:34.068 "is_configured": false, 00:17:34.068 "data_offset": 2048, 00:17:34.068 "data_size": 63488 00:17:34.068 } 00:17:34.068 ] 00:17:34.068 }' 00:17:34.068 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:34.068 15:53:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.637 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:34.637 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:34.637 15:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:34.638 [2024-07-12 15:53:55.021879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:34.638 [2024-07-12 15:53:55.021909] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:34.638 [2024-07-12 15:53:55.021921] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11de9e0 00:17:34.638 [2024-07-12 15:53:55.021927] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:34.638 [2024-07-12 15:53:55.022183] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:34.638 [2024-07-12 15:53:55.022194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:34.638 [2024-07-12 15:53:55.022235] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:34.638 [2024-07-12 15:53:55.022248] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:34.638 pt2 00:17:34.638 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:34.638 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:34.638 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:34.898 [2024-07-12 15:53:55.218375] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:34.898 [2024-07-12 15:53:55.218393] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:34.898 [2024-07-12 15:53:55.218401] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11de000 00:17:34.898 [2024-07-12 15:53:55.218407] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:34.898 [2024-07-12 15:53:55.218623] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:34.898 [2024-07-12 15:53:55.218633] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:34.898 [2024-07-12 15:53:55.218666] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:34.898 [2024-07-12 15:53:55.218675] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:34.898 [2024-07-12 15:53:55.218761] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x138b460 00:17:34.898 [2024-07-12 15:53:55.218768] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:34.898 [2024-07-12 
15:53:55.218896] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1398a70 00:17:34.898 [2024-07-12 15:53:55.219001] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x138b460 00:17:34.898 [2024-07-12 15:53:55.219007] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x138b460 00:17:34.898 [2024-07-12 15:53:55.219078] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:34.898 pt3 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.898 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:35.157 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.157 "name": "raid_bdev1", 00:17:35.157 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:35.157 "strip_size_kb": 0, 00:17:35.157 "state": "online", 00:17:35.157 "raid_level": "raid1", 00:17:35.157 "superblock": true, 00:17:35.157 "num_base_bdevs": 3, 00:17:35.157 "num_base_bdevs_discovered": 3, 00:17:35.157 "num_base_bdevs_operational": 3, 00:17:35.157 "base_bdevs_list": [ 00:17:35.157 { 00:17:35.157 "name": "pt1", 00:17:35.157 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:35.157 "is_configured": true, 00:17:35.157 "data_offset": 2048, 00:17:35.157 "data_size": 63488 00:17:35.157 }, 00:17:35.157 { 00:17:35.157 "name": "pt2", 00:17:35.157 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:35.157 "is_configured": true, 00:17:35.157 "data_offset": 2048, 00:17:35.157 "data_size": 63488 00:17:35.157 }, 00:17:35.157 { 00:17:35.157 "name": "pt3", 00:17:35.157 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:35.157 "is_configured": true, 00:17:35.157 "data_offset": 2048, 00:17:35.157 "data_size": 63488 00:17:35.157 } 00:17:35.157 ] 00:17:35.157 }' 00:17:35.157 15:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.157 15:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.726 15:53:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:35.726 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:35.726 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:35.726 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:35.726 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:35.726 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:35.726 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:35.726 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:35.985 [2024-07-12 15:53:56.241278] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:35.985 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:35.985 "name": "raid_bdev1", 00:17:35.985 "aliases": [ 00:17:35.985 "2273a614-62ae-48cc-bbaa-7b1d643bed10" 00:17:35.985 ], 00:17:35.985 "product_name": "Raid Volume", 00:17:35.985 "block_size": 512, 00:17:35.985 "num_blocks": 63488, 00:17:35.985 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:35.985 "assigned_rate_limits": { 00:17:35.985 "rw_ios_per_sec": 0, 00:17:35.985 "rw_mbytes_per_sec": 0, 00:17:35.986 "r_mbytes_per_sec": 0, 00:17:35.986 "w_mbytes_per_sec": 0 00:17:35.986 }, 00:17:35.986 "claimed": false, 00:17:35.986 "zoned": false, 00:17:35.986 "supported_io_types": { 00:17:35.986 "read": true, 00:17:35.986 "write": true, 00:17:35.986 "unmap": false, 00:17:35.986 "flush": false, 00:17:35.986 "reset": true, 00:17:35.986 "nvme_admin": false, 00:17:35.986 "nvme_io": false, 00:17:35.986 "nvme_io_md": false, 00:17:35.986 "write_zeroes": true, 00:17:35.986 "zcopy": false, 00:17:35.986 "get_zone_info": false, 00:17:35.986 "zone_management": false, 00:17:35.986 "zone_append": false, 00:17:35.986 "compare": false, 00:17:35.986 "compare_and_write": false, 00:17:35.986 "abort": false, 00:17:35.986 "seek_hole": false, 00:17:35.986 "seek_data": false, 00:17:35.986 "copy": false, 00:17:35.986 "nvme_iov_md": false 00:17:35.986 }, 00:17:35.986 "memory_domains": [ 00:17:35.986 { 00:17:35.986 "dma_device_id": "system", 00:17:35.986 "dma_device_type": 1 00:17:35.986 }, 00:17:35.986 { 00:17:35.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.986 "dma_device_type": 2 00:17:35.986 }, 00:17:35.986 { 00:17:35.986 "dma_device_id": "system", 00:17:35.986 "dma_device_type": 1 00:17:35.986 }, 00:17:35.986 { 00:17:35.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.986 "dma_device_type": 2 00:17:35.986 }, 00:17:35.986 { 00:17:35.986 "dma_device_id": "system", 00:17:35.986 "dma_device_type": 1 00:17:35.986 }, 00:17:35.986 { 00:17:35.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:35.986 "dma_device_type": 2 00:17:35.986 } 00:17:35.986 ], 00:17:35.986 "driver_specific": { 00:17:35.986 "raid": { 00:17:35.986 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:35.986 "strip_size_kb": 0, 00:17:35.986 "state": "online", 00:17:35.986 "raid_level": "raid1", 00:17:35.986 "superblock": true, 00:17:35.986 "num_base_bdevs": 3, 00:17:35.986 "num_base_bdevs_discovered": 3, 00:17:35.986 "num_base_bdevs_operational": 3, 00:17:35.986 "base_bdevs_list": [ 00:17:35.986 { 00:17:35.986 
"name": "pt1", 00:17:35.986 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:35.986 "is_configured": true, 00:17:35.986 "data_offset": 2048, 00:17:35.986 "data_size": 63488 00:17:35.986 }, 00:17:35.986 { 00:17:35.986 "name": "pt2", 00:17:35.986 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:35.986 "is_configured": true, 00:17:35.986 "data_offset": 2048, 00:17:35.986 "data_size": 63488 00:17:35.986 }, 00:17:35.986 { 00:17:35.986 "name": "pt3", 00:17:35.986 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:35.986 "is_configured": true, 00:17:35.986 "data_offset": 2048, 00:17:35.986 "data_size": 63488 00:17:35.986 } 00:17:35.986 ] 00:17:35.986 } 00:17:35.986 } 00:17:35.986 }' 00:17:35.986 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:35.986 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:35.986 pt2 00:17:35.986 pt3' 00:17:35.986 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:35.986 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:35.986 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:36.246 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:36.246 "name": "pt1", 00:17:36.246 "aliases": [ 00:17:36.246 "00000000-0000-0000-0000-000000000001" 00:17:36.246 ], 00:17:36.246 "product_name": "passthru", 00:17:36.246 "block_size": 512, 00:17:36.246 "num_blocks": 65536, 00:17:36.246 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:36.246 "assigned_rate_limits": { 00:17:36.246 "rw_ios_per_sec": 0, 00:17:36.246 "rw_mbytes_per_sec": 0, 00:17:36.246 "r_mbytes_per_sec": 0, 00:17:36.246 "w_mbytes_per_sec": 0 00:17:36.246 }, 00:17:36.246 "claimed": true, 00:17:36.246 "claim_type": "exclusive_write", 00:17:36.246 "zoned": false, 00:17:36.246 "supported_io_types": { 00:17:36.246 "read": true, 00:17:36.246 "write": true, 00:17:36.246 "unmap": true, 00:17:36.247 "flush": true, 00:17:36.247 "reset": true, 00:17:36.247 "nvme_admin": false, 00:17:36.247 "nvme_io": false, 00:17:36.247 "nvme_io_md": false, 00:17:36.247 "write_zeroes": true, 00:17:36.247 "zcopy": true, 00:17:36.247 "get_zone_info": false, 00:17:36.247 "zone_management": false, 00:17:36.247 "zone_append": false, 00:17:36.247 "compare": false, 00:17:36.247 "compare_and_write": false, 00:17:36.247 "abort": true, 00:17:36.247 "seek_hole": false, 00:17:36.247 "seek_data": false, 00:17:36.247 "copy": true, 00:17:36.247 "nvme_iov_md": false 00:17:36.247 }, 00:17:36.247 "memory_domains": [ 00:17:36.247 { 00:17:36.247 "dma_device_id": "system", 00:17:36.247 "dma_device_type": 1 00:17:36.247 }, 00:17:36.247 { 00:17:36.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.247 "dma_device_type": 2 00:17:36.247 } 00:17:36.247 ], 00:17:36.247 "driver_specific": { 00:17:36.247 "passthru": { 00:17:36.247 "name": "pt1", 00:17:36.247 "base_bdev_name": "malloc1" 00:17:36.247 } 00:17:36.247 } 00:17:36.247 }' 00:17:36.247 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.247 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:36.247 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:36.247 
15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.247 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:36.506 15:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:37.076 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:37.076 "name": "pt2", 00:17:37.076 "aliases": [ 00:17:37.076 "00000000-0000-0000-0000-000000000002" 00:17:37.076 ], 00:17:37.076 "product_name": "passthru", 00:17:37.076 "block_size": 512, 00:17:37.076 "num_blocks": 65536, 00:17:37.076 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:37.076 "assigned_rate_limits": { 00:17:37.076 "rw_ios_per_sec": 0, 00:17:37.076 "rw_mbytes_per_sec": 0, 00:17:37.076 "r_mbytes_per_sec": 0, 00:17:37.076 "w_mbytes_per_sec": 0 00:17:37.076 }, 00:17:37.076 "claimed": true, 00:17:37.076 "claim_type": "exclusive_write", 00:17:37.076 "zoned": false, 00:17:37.076 "supported_io_types": { 00:17:37.076 "read": true, 00:17:37.076 "write": true, 00:17:37.076 "unmap": true, 00:17:37.076 "flush": true, 00:17:37.076 "reset": true, 00:17:37.076 "nvme_admin": false, 00:17:37.076 "nvme_io": false, 00:17:37.076 "nvme_io_md": false, 00:17:37.076 "write_zeroes": true, 00:17:37.076 "zcopy": true, 00:17:37.076 "get_zone_info": false, 00:17:37.076 "zone_management": false, 00:17:37.076 "zone_append": false, 00:17:37.076 "compare": false, 00:17:37.076 "compare_and_write": false, 00:17:37.076 "abort": true, 00:17:37.076 "seek_hole": false, 00:17:37.076 "seek_data": false, 00:17:37.076 "copy": true, 00:17:37.076 "nvme_iov_md": false 00:17:37.076 }, 00:17:37.076 "memory_domains": [ 00:17:37.076 { 00:17:37.076 "dma_device_id": "system", 00:17:37.076 "dma_device_type": 1 00:17:37.076 }, 00:17:37.076 { 00:17:37.076 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.076 "dma_device_type": 2 00:17:37.076 } 00:17:37.076 ], 00:17:37.076 "driver_specific": { 00:17:37.076 "passthru": { 00:17:37.076 "name": "pt2", 00:17:37.076 "base_bdev_name": "malloc2" 00:17:37.076 } 00:17:37.076 } 00:17:37.076 }' 00:17:37.076 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:37.076 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:37.076 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:37.076 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:37.336 15:53:57 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:37.336 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:37.336 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:37.336 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:37.336 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:37.336 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.336 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:37.596 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:37.596 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:37.596 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:37.596 15:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:37.596 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:37.596 "name": "pt3", 00:17:37.596 "aliases": [ 00:17:37.596 "00000000-0000-0000-0000-000000000003" 00:17:37.596 ], 00:17:37.596 "product_name": "passthru", 00:17:37.596 "block_size": 512, 00:17:37.596 "num_blocks": 65536, 00:17:37.596 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:37.596 "assigned_rate_limits": { 00:17:37.596 "rw_ios_per_sec": 0, 00:17:37.596 "rw_mbytes_per_sec": 0, 00:17:37.596 "r_mbytes_per_sec": 0, 00:17:37.596 "w_mbytes_per_sec": 0 00:17:37.596 }, 00:17:37.596 "claimed": true, 00:17:37.596 "claim_type": "exclusive_write", 00:17:37.596 "zoned": false, 00:17:37.596 "supported_io_types": { 00:17:37.596 "read": true, 00:17:37.596 "write": true, 00:17:37.596 "unmap": true, 00:17:37.596 "flush": true, 00:17:37.596 "reset": true, 00:17:37.596 "nvme_admin": false, 00:17:37.596 "nvme_io": false, 00:17:37.596 "nvme_io_md": false, 00:17:37.596 "write_zeroes": true, 00:17:37.596 "zcopy": true, 00:17:37.596 "get_zone_info": false, 00:17:37.596 "zone_management": false, 00:17:37.596 "zone_append": false, 00:17:37.596 "compare": false, 00:17:37.596 "compare_and_write": false, 00:17:37.596 "abort": true, 00:17:37.596 "seek_hole": false, 00:17:37.596 "seek_data": false, 00:17:37.596 "copy": true, 00:17:37.596 "nvme_iov_md": false 00:17:37.596 }, 00:17:37.596 "memory_domains": [ 00:17:37.596 { 00:17:37.596 "dma_device_id": "system", 00:17:37.596 "dma_device_type": 1 00:17:37.596 }, 00:17:37.596 { 00:17:37.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.596 "dma_device_type": 2 00:17:37.596 } 00:17:37.596 ], 00:17:37.596 "driver_specific": { 00:17:37.596 "passthru": { 00:17:37.596 "name": "pt3", 00:17:37.596 "base_bdev_name": "malloc3" 00:17:37.596 } 00:17:37.596 } 00:17:37.596 }' 00:17:37.596 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:37.856 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:37.856 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:37.856 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:37.856 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:37.856 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == 
null ]] 00:17:37.856 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:37.856 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:38.115 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:38.115 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:38.115 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:38.115 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:38.115 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:38.115 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:38.375 [2024-07-12 15:53:58.599229] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 2273a614-62ae-48cc-bbaa-7b1d643bed10 '!=' 2273a614-62ae-48cc-bbaa-7b1d643bed10 ']' 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:38.375 [2024-07-12 15:53:58.791508] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.375 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:38.636 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.636 "name": "raid_bdev1", 00:17:38.636 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:38.636 "strip_size_kb": 0, 00:17:38.636 "state": "online", 00:17:38.636 "raid_level": "raid1", 00:17:38.636 "superblock": true, 
00:17:38.636 "num_base_bdevs": 3, 00:17:38.636 "num_base_bdevs_discovered": 2, 00:17:38.636 "num_base_bdevs_operational": 2, 00:17:38.636 "base_bdevs_list": [ 00:17:38.636 { 00:17:38.636 "name": null, 00:17:38.636 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.636 "is_configured": false, 00:17:38.636 "data_offset": 2048, 00:17:38.636 "data_size": 63488 00:17:38.636 }, 00:17:38.636 { 00:17:38.636 "name": "pt2", 00:17:38.636 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:38.636 "is_configured": true, 00:17:38.636 "data_offset": 2048, 00:17:38.636 "data_size": 63488 00:17:38.636 }, 00:17:38.636 { 00:17:38.636 "name": "pt3", 00:17:38.636 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:38.636 "is_configured": true, 00:17:38.636 "data_offset": 2048, 00:17:38.636 "data_size": 63488 00:17:38.636 } 00:17:38.636 ] 00:17:38.636 }' 00:17:38.636 15:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.636 15:53:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.209 15:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:39.469 [2024-07-12 15:53:59.705814] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:39.469 [2024-07-12 15:53:59.705833] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:39.469 [2024-07-12 15:53:59.705870] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:39.469 [2024-07-12 15:53:59.705911] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:39.469 [2024-07-12 15:53:59.705917] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x138b460 name raid_bdev1, state offline 00:17:39.469 15:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.469 15:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:17:39.469 15:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:17:39.469 15:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:17:39.469 15:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:17:39.469 15:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:39.469 15:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:39.730 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:39.730 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:39.730 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:39.990 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:17:39.990 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:17:39.990 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:17:39.990 15:54:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:39.990 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:40.251 [2024-07-12 15:54:00.463704] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:40.251 [2024-07-12 15:54:00.463749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:40.251 [2024-07-12 15:54:00.463760] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1398360 00:17:40.251 [2024-07-12 15:54:00.463767] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:40.251 [2024-07-12 15:54:00.465094] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:40.251 [2024-07-12 15:54:00.465116] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:40.251 [2024-07-12 15:54:00.465162] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:40.251 [2024-07-12 15:54:00.465182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:40.251 pt2 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:40.251 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.251 "name": "raid_bdev1", 00:17:40.251 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:40.251 "strip_size_kb": 0, 00:17:40.251 "state": "configuring", 00:17:40.251 "raid_level": "raid1", 00:17:40.251 "superblock": true, 00:17:40.251 "num_base_bdevs": 3, 00:17:40.251 "num_base_bdevs_discovered": 1, 00:17:40.252 "num_base_bdevs_operational": 2, 00:17:40.252 "base_bdevs_list": [ 00:17:40.252 { 00:17:40.252 "name": null, 00:17:40.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.252 "is_configured": false, 00:17:40.252 "data_offset": 2048, 00:17:40.252 "data_size": 63488 00:17:40.252 }, 00:17:40.252 { 00:17:40.252 "name": "pt2", 00:17:40.252 "uuid": "00000000-0000-0000-0000-000000000002", 
00:17:40.252 "is_configured": true, 00:17:40.252 "data_offset": 2048, 00:17:40.252 "data_size": 63488 00:17:40.252 }, 00:17:40.252 { 00:17:40.252 "name": null, 00:17:40.252 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:40.252 "is_configured": false, 00:17:40.252 "data_offset": 2048, 00:17:40.252 "data_size": 63488 00:17:40.252 } 00:17:40.252 ] 00:17:40.252 }' 00:17:40.252 15:54:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.252 15:54:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:41.192 [2024-07-12 15:54:01.554475] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:41.192 [2024-07-12 15:54:01.554513] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:41.192 [2024-07-12 15:54:01.554525] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x138e390 00:17:41.192 [2024-07-12 15:54:01.554532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:41.192 [2024-07-12 15:54:01.554812] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:41.192 [2024-07-12 15:54:01.554825] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:41.192 [2024-07-12 15:54:01.554869] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:41.192 [2024-07-12 15:54:01.554883] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:41.192 [2024-07-12 15:54:01.554962] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11e6d80 00:17:41.192 [2024-07-12 15:54:01.554969] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:41.192 [2024-07-12 15:54:01.555104] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e7600 00:17:41.192 [2024-07-12 15:54:01.555202] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11e6d80 00:17:41.192 [2024-07-12 15:54:01.555207] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11e6d80 00:17:41.192 [2024-07-12 15:54:01.555278] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:41.192 pt3 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:41.192 15:54:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.192 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:41.452 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.452 "name": "raid_bdev1", 00:17:41.452 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:41.452 "strip_size_kb": 0, 00:17:41.452 "state": "online", 00:17:41.452 "raid_level": "raid1", 00:17:41.452 "superblock": true, 00:17:41.452 "num_base_bdevs": 3, 00:17:41.452 "num_base_bdevs_discovered": 2, 00:17:41.452 "num_base_bdevs_operational": 2, 00:17:41.452 "base_bdevs_list": [ 00:17:41.452 { 00:17:41.452 "name": null, 00:17:41.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.452 "is_configured": false, 00:17:41.452 "data_offset": 2048, 00:17:41.452 "data_size": 63488 00:17:41.452 }, 00:17:41.452 { 00:17:41.452 "name": "pt2", 00:17:41.452 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:41.452 "is_configured": true, 00:17:41.452 "data_offset": 2048, 00:17:41.452 "data_size": 63488 00:17:41.452 }, 00:17:41.452 { 00:17:41.452 "name": "pt3", 00:17:41.452 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:41.452 "is_configured": true, 00:17:41.452 "data_offset": 2048, 00:17:41.452 "data_size": 63488 00:17:41.452 } 00:17:41.452 ] 00:17:41.452 }' 00:17:41.452 15:54:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.452 15:54:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:42.022 15:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:42.281 [2024-07-12 15:54:02.496847] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:42.281 [2024-07-12 15:54:02.496862] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:42.281 [2024-07-12 15:54:02.496899] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:42.281 [2024-07-12 15:54:02.496939] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:42.281 [2024-07-12 15:54:02.496945] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11e6d80 name raid_bdev1, state offline 00:17:42.281 15:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.281 15:54:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:17:42.851 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:17:42.851 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:17:42.851 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 
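What follows removes one passthru leg and re-registers another so that the examine path can re-assemble raid_bdev1 from the on-disk superblocks. Stripped down to the RPC calls that appear in the trace below (same socket, bdev names, and UUID), the sequence is roughly:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock
# Drop pt3, then re-create pt1 on top of malloc1; examine finds the raid
# superblocks again and rebuilds raid_bdev1, preferring the newer superblock
# (seq_number 4 on pt2), as the trace below records.
"$rpc" -s "$sock" bdev_passthru_delete pt3
"$rpc" -s "$sock" bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001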
00:17:42.851 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:17:42.851 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:42.851 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:43.111 [2024-07-12 15:54:03.423141] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:43.111 [2024-07-12 15:54:03.423168] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:43.111 [2024-07-12 15:54:03.423177] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11de770 00:17:43.111 [2024-07-12 15:54:03.423184] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:43.111 [2024-07-12 15:54:03.424420] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:43.112 [2024-07-12 15:54:03.424441] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:43.112 [2024-07-12 15:54:03.424483] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:43.112 [2024-07-12 15:54:03.424501] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:43.112 [2024-07-12 15:54:03.424573] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:43.112 [2024-07-12 15:54:03.424580] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:43.112 [2024-07-12 15:54:03.424588] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11e1660 name raid_bdev1, state configuring 00:17:43.112 [2024-07-12 15:54:03.424602] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:43.112 pt1 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.112 15:54:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:43.372 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:43.372 "name": "raid_bdev1", 00:17:43.372 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:43.372 "strip_size_kb": 0, 00:17:43.372 "state": "configuring", 00:17:43.372 "raid_level": "raid1", 00:17:43.372 "superblock": true, 00:17:43.372 "num_base_bdevs": 3, 00:17:43.372 "num_base_bdevs_discovered": 1, 00:17:43.372 "num_base_bdevs_operational": 2, 00:17:43.372 "base_bdevs_list": [ 00:17:43.372 { 00:17:43.372 "name": null, 00:17:43.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:43.372 "is_configured": false, 00:17:43.372 "data_offset": 2048, 00:17:43.372 "data_size": 63488 00:17:43.372 }, 00:17:43.372 { 00:17:43.372 "name": "pt2", 00:17:43.372 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:43.372 "is_configured": true, 00:17:43.372 "data_offset": 2048, 00:17:43.372 "data_size": 63488 00:17:43.372 }, 00:17:43.372 { 00:17:43.372 "name": null, 00:17:43.372 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:43.372 "is_configured": false, 00:17:43.372 "data_offset": 2048, 00:17:43.372 "data_size": 63488 00:17:43.372 } 00:17:43.372 ] 00:17:43.372 }' 00:17:43.373 15:54:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:43.373 15:54:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:43.942 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:43.942 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:43.942 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:17:43.943 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:44.203 [2024-07-12 15:54:04.554000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:44.203 [2024-07-12 15:54:04.554031] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:44.203 [2024-07-12 15:54:04.554044] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11ddd40 00:17:44.203 [2024-07-12 15:54:04.554051] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:44.203 [2024-07-12 15:54:04.554315] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:44.203 [2024-07-12 15:54:04.554326] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:44.203 [2024-07-12 15:54:04.554369] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:44.203 [2024-07-12 15:54:04.554382] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:44.203 [2024-07-12 15:54:04.554457] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11e18e0 00:17:44.203 [2024-07-12 15:54:04.554463] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:44.203 [2024-07-12 15:54:04.554598] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11e6820 00:17:44.203 [2024-07-12 15:54:04.554696] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11e18e0 00:17:44.203 [2024-07-12 15:54:04.554702] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11e18e0 00:17:44.203 [2024-07-12 15:54:04.554782] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:44.203 pt3 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.203 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:44.462 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.462 "name": "raid_bdev1", 00:17:44.462 "uuid": "2273a614-62ae-48cc-bbaa-7b1d643bed10", 00:17:44.462 "strip_size_kb": 0, 00:17:44.462 "state": "online", 00:17:44.462 "raid_level": "raid1", 00:17:44.462 "superblock": true, 00:17:44.462 "num_base_bdevs": 3, 00:17:44.462 "num_base_bdevs_discovered": 2, 00:17:44.462 "num_base_bdevs_operational": 2, 00:17:44.462 "base_bdevs_list": [ 00:17:44.462 { 00:17:44.462 "name": null, 00:17:44.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.462 "is_configured": false, 00:17:44.462 "data_offset": 2048, 00:17:44.462 "data_size": 63488 00:17:44.462 }, 00:17:44.462 { 00:17:44.462 "name": "pt2", 00:17:44.462 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:44.462 "is_configured": true, 00:17:44.462 "data_offset": 2048, 00:17:44.462 "data_size": 63488 00:17:44.462 }, 00:17:44.462 { 00:17:44.462 "name": "pt3", 00:17:44.462 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:44.462 "is_configured": true, 00:17:44.462 "data_offset": 2048, 00:17:44.462 "data_size": 63488 00:17:44.462 } 00:17:44.462 ] 00:17:44.462 }' 00:17:44.462 15:54:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.462 15:54:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.030 15:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:45.030 15:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:45.634 15:54:05 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:17:45.634 15:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:45.634 15:54:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:17:45.634 [2024-07-12 15:54:06.025931] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:45.634 15:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 2273a614-62ae-48cc-bbaa-7b1d643bed10 '!=' 2273a614-62ae-48cc-bbaa-7b1d643bed10 ']' 00:17:45.634 15:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2561663 00:17:45.634 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2561663 ']' 00:17:45.634 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2561663 00:17:45.634 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:45.635 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:45.635 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2561663 00:17:45.894 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:45.894 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:45.894 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2561663' 00:17:45.894 killing process with pid 2561663 00:17:45.894 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2561663 00:17:45.894 [2024-07-12 15:54:06.092939] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:45.894 [2024-07-12 15:54:06.092979] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:45.894 [2024-07-12 15:54:06.093020] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:45.894 [2024-07-12 15:54:06.093026] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11e18e0 name raid_bdev1, state offline 00:17:45.894 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2561663 00:17:45.894 [2024-07-12 15:54:06.108193] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:45.894 15:54:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:45.894 00:17:45.894 real 0m19.962s 00:17:45.894 user 0m37.527s 00:17:45.894 sys 0m2.777s 00:17:45.894 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:45.894 15:54:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:45.894 ************************************ 00:17:45.894 END TEST raid_superblock_test 00:17:45.894 ************************************ 00:17:45.894 15:54:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:45.894 15:54:06 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:17:45.894 15:54:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:45.894 15:54:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:45.894 15:54:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:45.894 
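The checks above are the tail of the superblock test: once the pt3 passthru bdev is registered, raid_bdev1 re-assembles itself from the superblocks found on pt2/pt3, and the script confirms over RPC that the array is online and that its UUID matches what the generic bdev layer reports. A minimal shell sketch of that verification, using only rpc.py subcommands and jq filters that appear in the trace (the $rpc shorthand and the bracket tests are illustrative simplifications of the real bdev_raid.sh helpers):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # attach the last passthru bdev (pt3) so the array can leave the "configuring" state
    $rpc bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003

    # re-read the raid bdev and confirm it has gone online as raid1 with 2 base bdevs discovered
    info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(echo "$info" | jq -r '.state') == online ]]

    # the UUID reported by the generic bdev layer must match the one read from the superblock
    raid_uuid=$(echo "$info" | jq -r '.uuid')
    bdev_uuid=$($rpc bdev_get_bdevs -b raid_bdev1 | jq -r '.[] | .uuid')
    [ "$bdev_uuid" = "$raid_uuid" ]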
************************************ 00:17:45.894 START TEST raid_read_error_test 00:17:45.894 ************************************ 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.TGg6wDIbAS 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2565465 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2565465 /var/tmp/spdk-raid.sock 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2565465 ']' 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:45.894 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:45.894 15:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.153 [2024-07-12 15:54:06.381184] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:17:46.153 [2024-07-12 15:54:06.381243] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2565465 ] 00:17:46.153 [2024-07-12 15:54:06.472656] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:46.153 [2024-07-12 15:54:06.550682] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:46.153 [2024-07-12 15:54:06.591890] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:46.153 [2024-07-12 15:54:06.591916] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:47.091 15:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:47.091 15:54:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:47.091 15:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:47.091 15:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:47.091 BaseBdev1_malloc 00:17:47.091 15:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:47.350 true 00:17:47.350 15:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:47.350 [2024-07-12 15:54:07.787506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:47.350 [2024-07-12 15:54:07.787537] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:47.350 [2024-07-12 15:54:07.787549] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27dbaa0 00:17:47.350 [2024-07-12 15:54:07.787556] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:47.350 [2024-07-12 15:54:07.788774] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:47.350 [2024-07-12 15:54:07.788793] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:47.350 BaseBdev1 00:17:47.611 15:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:47.611 15:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:47.611 BaseBdev2_malloc 00:17:47.611 15:54:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:47.871 true 00:17:47.871 15:54:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:48.131 [2024-07-12 15:54:08.362740] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:48.131 [2024-07-12 15:54:08.362766] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:48.131 [2024-07-12 15:54:08.362777] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27e0e40 00:17:48.131 [2024-07-12 15:54:08.362784] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:48.131 [2024-07-12 15:54:08.363927] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:48.131 [2024-07-12 15:54:08.363946] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:48.131 BaseBdev2 00:17:48.131 15:54:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:48.131 15:54:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:48.131 BaseBdev3_malloc 00:17:48.390 15:54:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:48.390 true 00:17:48.390 15:54:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:48.685 [2024-07-12 15:54:08.937892] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:48.685 [2024-07-12 15:54:08.937918] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:48.685 [2024-07-12 15:54:08.937929] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27e27f0 00:17:48.685 [2024-07-12 15:54:08.937936] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:48.685 [2024-07-12 15:54:08.939074] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:48.685 [2024-07-12 15:54:08.939093] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:48.685 BaseBdev3 00:17:48.685 15:54:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:48.685 [2024-07-12 15:54:09.122378] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:48.685 [2024-07-12 15:54:09.123359] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:48.685 [2024-07-12 15:54:09.123413] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:48.685 [2024-07-12 
15:54:09.123566] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27e0750 00:17:48.685 [2024-07-12 15:54:09.123573] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:48.685 [2024-07-12 15:54:09.123721] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27e3970 00:17:48.685 [2024-07-12 15:54:09.123842] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27e0750 00:17:48.685 [2024-07-12 15:54:09.123848] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27e0750 00:17:48.685 [2024-07-12 15:54:09.123922] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:48.946 "name": "raid_bdev1", 00:17:48.946 "uuid": "8a911a0b-5895-4f46-9fa5-c33efb05e7f4", 00:17:48.946 "strip_size_kb": 0, 00:17:48.946 "state": "online", 00:17:48.946 "raid_level": "raid1", 00:17:48.946 "superblock": true, 00:17:48.946 "num_base_bdevs": 3, 00:17:48.946 "num_base_bdevs_discovered": 3, 00:17:48.946 "num_base_bdevs_operational": 3, 00:17:48.946 "base_bdevs_list": [ 00:17:48.946 { 00:17:48.946 "name": "BaseBdev1", 00:17:48.946 "uuid": "41798c14-2455-5450-a707-e33d1a3d28a9", 00:17:48.946 "is_configured": true, 00:17:48.946 "data_offset": 2048, 00:17:48.946 "data_size": 63488 00:17:48.946 }, 00:17:48.946 { 00:17:48.946 "name": "BaseBdev2", 00:17:48.946 "uuid": "02f6db20-1703-5759-b840-717f40c037db", 00:17:48.946 "is_configured": true, 00:17:48.946 "data_offset": 2048, 00:17:48.946 "data_size": 63488 00:17:48.946 }, 00:17:48.946 { 00:17:48.946 "name": "BaseBdev3", 00:17:48.946 "uuid": "2018df73-f494-50b6-b3c4-6ec449da9699", 00:17:48.946 "is_configured": true, 00:17:48.946 "data_offset": 2048, 00:17:48.946 "data_size": 63488 00:17:48.946 } 00:17:48.946 ] 00:17:48.946 }' 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:48.946 15:54:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set 
+x 00:17:49.516 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:49.516 15:54:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:49.775 [2024-07-12 15:54:09.984784] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2636df0 00:17:50.716 15:54:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.716 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:50.976 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.976 "name": "raid_bdev1", 00:17:50.976 "uuid": "8a911a0b-5895-4f46-9fa5-c33efb05e7f4", 00:17:50.976 "strip_size_kb": 0, 00:17:50.976 "state": "online", 00:17:50.976 "raid_level": "raid1", 00:17:50.976 "superblock": true, 00:17:50.976 "num_base_bdevs": 3, 00:17:50.976 "num_base_bdevs_discovered": 3, 00:17:50.976 "num_base_bdevs_operational": 3, 00:17:50.976 "base_bdevs_list": [ 00:17:50.976 { 00:17:50.976 "name": "BaseBdev1", 00:17:50.976 "uuid": "41798c14-2455-5450-a707-e33d1a3d28a9", 00:17:50.976 "is_configured": true, 00:17:50.976 "data_offset": 2048, 00:17:50.976 "data_size": 63488 00:17:50.976 }, 00:17:50.976 { 00:17:50.976 "name": "BaseBdev2", 00:17:50.976 "uuid": "02f6db20-1703-5759-b840-717f40c037db", 00:17:50.976 "is_configured": true, 00:17:50.976 "data_offset": 2048, 00:17:50.976 "data_size": 63488 00:17:50.976 }, 00:17:50.976 { 00:17:50.976 "name": "BaseBdev3", 00:17:50.976 "uuid": 
"2018df73-f494-50b6-b3c4-6ec449da9699", 00:17:50.976 "is_configured": true, 00:17:50.976 "data_offset": 2048, 00:17:50.976 "data_size": 63488 00:17:50.976 } 00:17:50.976 ] 00:17:50.976 }' 00:17:50.976 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.976 15:54:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.546 15:54:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:51.807 [2024-07-12 15:54:11.998212] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:51.807 [2024-07-12 15:54:11.998243] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:51.807 [2024-07-12 15:54:12.000807] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:51.807 [2024-07-12 15:54:12.000832] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:51.807 [2024-07-12 15:54:12.000909] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:51.807 [2024-07-12 15:54:12.000916] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27e0750 name raid_bdev1, state offline 00:17:51.807 0 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2565465 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2565465 ']' 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2565465 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2565465 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2565465' 00:17:51.807 killing process with pid 2565465 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2565465 00:17:51.807 [2024-07-12 15:54:12.084587] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2565465 00:17:51.807 [2024-07-12 15:54:12.096199] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.TGg6wDIbAS 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- 
# return 0 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:51.807 00:17:51.807 real 0m5.921s 00:17:51.807 user 0m9.387s 00:17:51.807 sys 0m0.882s 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:51.807 15:54:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.807 ************************************ 00:17:51.807 END TEST raid_read_error_test 00:17:51.807 ************************************ 00:17:52.069 15:54:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:52.069 15:54:12 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:17:52.069 15:54:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:52.069 15:54:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:52.069 15:54:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:52.069 ************************************ 00:17:52.069 START TEST raid_write_error_test 00:17:52.069 ************************************ 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.2LVK2kqLt2 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2567077 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2567077 /var/tmp/spdk-raid.sock 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2567077 ']' 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:52.069 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:52.069 15:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.069 [2024-07-12 15:54:12.376111] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
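raid_write_error_test starts the same way raid_read_error_test did a moment earlier: bdevperf is launched idle against the RAID RPC socket, its output goes to a mktemp file under /raidtest, and the workload is only triggered later with bdevperf.py once raid_bdev1 exists. A condensed sketch of that launch sequence, assembled from the command lines visible in the trace (backgrounding, the output redirection, and the placement of the perform_tests call are simplifications; the flags themselves are copied verbatim):

    # start bdevperf idle; -z keeps it waiting until the perform_tests RPC is issued below
    bdevperf_log=$(mktemp -p /raidtest)          # e.g. /raidtest/tmp.2LVK2kqLt2 in this run
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f \
        -L bdev_raid > "$bdevperf_log" &
    raid_pid=$!   # 2567077 here; the script then runs waitforlisten against /var/tmp/spdk-raid.sock

    # ... build the base bdevs and raid_bdev1 over rpc.py, inject the error ...

    # kick off the actual randrw workload; the fail-per-second grep at the end reads $bdevperf_log
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py \
        -s /var/tmp/spdk-raid.sock perform_tests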
00:17:52.069 [2024-07-12 15:54:12.376168] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2567077 ] 00:17:52.069 [2024-07-12 15:54:12.467868] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:52.329 [2024-07-12 15:54:12.544049] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:52.329 [2024-07-12 15:54:12.590921] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:52.329 [2024-07-12 15:54:12.590947] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:52.590 15:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:52.590 15:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:52.590 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:52.590 15:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:52.849 BaseBdev1_malloc 00:17:52.849 15:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:52.849 true 00:17:52.849 15:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:53.109 [2024-07-12 15:54:13.422068] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:53.109 [2024-07-12 15:54:13.422099] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:53.109 [2024-07-12 15:54:13.422110] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x269eaa0 00:17:53.109 [2024-07-12 15:54:13.422117] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:53.109 [2024-07-12 15:54:13.423354] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:53.109 [2024-07-12 15:54:13.423374] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:53.109 BaseBdev1 00:17:53.109 15:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:53.109 15:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:53.368 BaseBdev2_malloc 00:17:53.368 15:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:53.368 true 00:17:53.628 15:54:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:53.628 [2024-07-12 15:54:13.985024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:53.628 [2024-07-12 15:54:13.985049] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:53.628 [2024-07-12 15:54:13.985059] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a3e40 00:17:53.628 [2024-07-12 15:54:13.985065] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:53.628 [2024-07-12 15:54:13.986206] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:53.628 [2024-07-12 15:54:13.986225] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:53.628 BaseBdev2 00:17:53.628 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:53.628 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:53.887 BaseBdev3_malloc 00:17:53.887 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:54.147 true 00:17:54.147 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:54.147 [2024-07-12 15:54:14.547995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:54.147 [2024-07-12 15:54:14.548019] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:54.147 [2024-07-12 15:54:14.548030] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a57f0 00:17:54.147 [2024-07-12 15:54:14.548037] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:54.147 [2024-07-12 15:54:14.549176] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:54.147 [2024-07-12 15:54:14.549194] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:54.147 BaseBdev3 00:17:54.147 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:54.407 [2024-07-12 15:54:14.732481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:54.407 [2024-07-12 15:54:14.733452] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:54.407 [2024-07-12 15:54:14.733505] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:54.407 [2024-07-12 15:54:14.733656] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26a3750 00:17:54.407 [2024-07-12 15:54:14.733663] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:54.407 [2024-07-12 15:54:14.733808] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a6970 00:17:54.407 [2024-07-12 15:54:14.733927] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26a3750 00:17:54.407 [2024-07-12 15:54:14.733933] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26a3750 00:17:54.407 [2024-07-12 15:54:14.734007] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:54.407 
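Each BaseBdevN consumed by the array is itself a three-layer stack: a 32 MiB malloc bdev with 512-byte blocks, an error-injection bdev wrapped around it (which takes the EE_ prefix), and a passthru bdev that re-exposes the stack under the plain BaseBdevN name. The trace walks that sequence three times and then assembles the raid1 array with a superblock; compressed into a loop it is roughly the following (the loop and the $rpc shorthand are illustrative, the RPC subcommands and arguments are taken verbatim from the trace):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    for bdev in BaseBdev1 BaseBdev2 BaseBdev3; do
        $rpc bdev_malloc_create 32 512 -b ${bdev}_malloc          # 32 MiB backing store, 512 B blocks
        $rpc bdev_error_create ${bdev}_malloc                     # error bdev appears as EE_${bdev}_malloc
        $rpc bdev_passthru_create -b EE_${bdev}_malloc -p ${bdev} # plain name the raid level will claim
    done

    # raid1 over the three passthru bdevs; -s writes a superblock onto each base bdev
    $rpc bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s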
15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.407 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:54.667 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.667 "name": "raid_bdev1", 00:17:54.667 "uuid": "a33b33dd-b739-45c4-9029-f07b584cdb6d", 00:17:54.667 "strip_size_kb": 0, 00:17:54.667 "state": "online", 00:17:54.667 "raid_level": "raid1", 00:17:54.667 "superblock": true, 00:17:54.667 "num_base_bdevs": 3, 00:17:54.667 "num_base_bdevs_discovered": 3, 00:17:54.667 "num_base_bdevs_operational": 3, 00:17:54.667 "base_bdevs_list": [ 00:17:54.667 { 00:17:54.667 "name": "BaseBdev1", 00:17:54.667 "uuid": "69b693eb-5002-50d9-9fd1-2b65e28ff7bd", 00:17:54.667 "is_configured": true, 00:17:54.667 "data_offset": 2048, 00:17:54.667 "data_size": 63488 00:17:54.667 }, 00:17:54.667 { 00:17:54.667 "name": "BaseBdev2", 00:17:54.667 "uuid": "7e90efe2-c7bb-5847-8635-3ae4971e97c0", 00:17:54.667 "is_configured": true, 00:17:54.667 "data_offset": 2048, 00:17:54.667 "data_size": 63488 00:17:54.667 }, 00:17:54.667 { 00:17:54.667 "name": "BaseBdev3", 00:17:54.667 "uuid": "f1a52dee-121d-5f38-8a83-80792841c931", 00:17:54.667 "is_configured": true, 00:17:54.667 "data_offset": 2048, 00:17:54.667 "data_size": 63488 00:17:54.667 } 00:17:54.667 ] 00:17:54.667 }' 00:17:54.667 15:54:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.667 15:54:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:55.236 15:54:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:55.236 15:54:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:55.236 [2024-07-12 15:54:15.606925] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24f9df0 00:17:56.175 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:56.434 [2024-07-12 15:54:16.702542] 
bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:56.434 [2024-07-12 15:54:16.702587] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:56.434 [2024-07-12 15:54:16.702771] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x24f9df0 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.434 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:56.694 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.694 "name": "raid_bdev1", 00:17:56.694 "uuid": "a33b33dd-b739-45c4-9029-f07b584cdb6d", 00:17:56.694 "strip_size_kb": 0, 00:17:56.694 "state": "online", 00:17:56.694 "raid_level": "raid1", 00:17:56.694 "superblock": true, 00:17:56.694 "num_base_bdevs": 3, 00:17:56.694 "num_base_bdevs_discovered": 2, 00:17:56.694 "num_base_bdevs_operational": 2, 00:17:56.694 "base_bdevs_list": [ 00:17:56.694 { 00:17:56.694 "name": null, 00:17:56.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.694 "is_configured": false, 00:17:56.694 "data_offset": 2048, 00:17:56.694 "data_size": 63488 00:17:56.694 }, 00:17:56.694 { 00:17:56.694 "name": "BaseBdev2", 00:17:56.694 "uuid": "7e90efe2-c7bb-5847-8635-3ae4971e97c0", 00:17:56.694 "is_configured": true, 00:17:56.694 "data_offset": 2048, 00:17:56.694 "data_size": 63488 00:17:56.694 }, 00:17:56.694 { 00:17:56.694 "name": "BaseBdev3", 00:17:56.694 "uuid": "f1a52dee-121d-5f38-8a83-80792841c931", 00:17:56.694 "is_configured": true, 00:17:56.694 "data_offset": 2048, 00:17:56.694 "data_size": 63488 00:17:56.694 } 00:17:56.694 ] 00:17:56.694 }' 00:17:56.694 15:54:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.694 
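This is where the read and write variants of the test diverge. In the read test above, injecting 'read failure' into EE_BaseBdev1_malloc left all three base bdevs in place (the raid1 read can be satisfied from another replica, so the state check still expected three discovered bdevs); here the injected 'write failure' makes raid_bdev1 fail BaseBdev1 out of slot 0, and the follow-up verify expects only two operational base bdevs with a null entry in slot 0. The two injection calls, as issued in the traces (both are ordinary rpc.py calls against the same error bdev):

    # read test: the array stays intact, verify_raid_bdev_state still expects "online raid1 0 3"
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_error_inject_error EE_BaseBdev1_malloc read failure

    # write test: slot 0 is removed and verify_raid_bdev_state now expects "online raid1 0 2"
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_error_inject_error EE_BaseBdev1_malloc write failure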
15:54:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:57.264 [2024-07-12 15:54:17.598193] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:57.264 [2024-07-12 15:54:17.598219] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:57.264 [2024-07-12 15:54:17.600806] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:57.264 [2024-07-12 15:54:17.600828] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:57.264 [2024-07-12 15:54:17.600885] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:57.264 [2024-07-12 15:54:17.600891] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26a3750 name raid_bdev1, state offline 00:17:57.264 0 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2567077 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2567077 ']' 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2567077 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2567077 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2567077' 00:17:57.264 killing process with pid 2567077 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2567077 00:17:57.264 [2024-07-12 15:54:17.668674] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:57.264 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2567077 00:17:57.264 [2024-07-12 15:54:17.679672] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.2LVK2kqLt2 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:57.524 00:17:57.524 real 0m5.507s 00:17:57.524 user 0m9.146s 00:17:57.524 sys 0m0.825s 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:17:57.524 15:54:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.525 ************************************ 00:17:57.525 END TEST raid_write_error_test 00:17:57.525 ************************************ 00:17:57.525 15:54:17 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:57.525 15:54:17 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:17:57.525 15:54:17 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:57.525 15:54:17 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:17:57.525 15:54:17 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:57.525 15:54:17 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:57.525 15:54:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:57.525 ************************************ 00:17:57.525 START TEST raid_state_function_test 00:17:57.525 ************************************ 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2568087 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2568087' 00:17:57.525 Process raid pid: 2568087 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2568087 /var/tmp/spdk-raid.sock 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2568087 ']' 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:57.525 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:57.525 15:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.525 [2024-07-12 15:54:17.952901] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
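Unlike the two error tests, raid_state_function_test runs no I/O at all: it drives a bare bdev_svc application (started above with -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid) purely over RPC and watches the raid bdev's state machine. The first scenario, which the trace below walks through, creates Existed_Raid while none of its four base bdevs exist yet, so the array must sit in the "configuring" state with zeroed UUIDs. Sketched with the calls that appear below ($rpc shorthand as before; the script captures the whole JSON object rather than a single field):

    rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

    # none of BaseBdev1..4 exist yet; raid0 with a 64 KiB strip size (-z 64)
    $rpc bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

    # state must be "configuring", num_base_bdevs_discovered must be 0
    $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'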
00:17:57.525 [2024-07-12 15:54:17.952955] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:57.784 [2024-07-12 15:54:18.042740] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:57.784 [2024-07-12 15:54:18.110629] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:57.784 [2024-07-12 15:54:18.155540] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:57.784 [2024-07-12 15:54:18.155562] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:58.353 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:58.353 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:58.353 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:58.612 [2024-07-12 15:54:18.966840] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:58.612 [2024-07-12 15:54:18.966868] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:58.612 [2024-07-12 15:54:18.966875] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:58.612 [2024-07-12 15:54:18.966880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:58.612 [2024-07-12 15:54:18.966885] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:58.612 [2024-07-12 15:54:18.966890] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:58.612 [2024-07-12 15:54:18.966895] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:58.612 [2024-07-12 15:54:18.966900] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.612 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.872 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.872 "name": "Existed_Raid", 00:17:58.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.872 "strip_size_kb": 64, 00:17:58.872 "state": "configuring", 00:17:58.872 "raid_level": "raid0", 00:17:58.872 "superblock": false, 00:17:58.872 "num_base_bdevs": 4, 00:17:58.872 "num_base_bdevs_discovered": 0, 00:17:58.872 "num_base_bdevs_operational": 4, 00:17:58.872 "base_bdevs_list": [ 00:17:58.872 { 00:17:58.872 "name": "BaseBdev1", 00:17:58.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.872 "is_configured": false, 00:17:58.872 "data_offset": 0, 00:17:58.872 "data_size": 0 00:17:58.872 }, 00:17:58.872 { 00:17:58.872 "name": "BaseBdev2", 00:17:58.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.872 "is_configured": false, 00:17:58.872 "data_offset": 0, 00:17:58.872 "data_size": 0 00:17:58.872 }, 00:17:58.872 { 00:17:58.872 "name": "BaseBdev3", 00:17:58.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.872 "is_configured": false, 00:17:58.872 "data_offset": 0, 00:17:58.872 "data_size": 0 00:17:58.872 }, 00:17:58.872 { 00:17:58.872 "name": "BaseBdev4", 00:17:58.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.872 "is_configured": false, 00:17:58.872 "data_offset": 0, 00:17:58.872 "data_size": 0 00:17:58.872 } 00:17:58.872 ] 00:17:58.872 }' 00:17:58.872 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.872 15:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.440 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:59.700 [2024-07-12 15:54:19.897077] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:59.700 [2024-07-12 15:54:19.897094] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e0920 name Existed_Raid, state configuring 00:17:59.700 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:59.700 [2024-07-12 15:54:20.093974] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:59.700 [2024-07-12 15:54:20.093998] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:59.700 [2024-07-12 15:54:20.094004] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:59.700 [2024-07-12 15:54:20.094009] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:59.700 [2024-07-12 15:54:20.094014] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:59.700 [2024-07-12 15:54:20.094019] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:59.700 [2024-07-12 15:54:20.094024] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:59.700 
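Condensed from the trace above: creating the raid before any of its base bdevs exist is expected to succeed and leave it in the "configuring" state with zero discovered members. A minimal sketch of that check, reusing the RPC socket and jq filter from the trace (the trailing .state and .num_base_bdevs_discovered selectors are added here for illustration):

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Create a 4-disk raid0 with 64 KiB strips over base bdevs that do not exist yet.
$rpc bdev_raid_create -z 64 -r raid0 \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# The raid should be registered but stuck in "configuring" with nothing discovered.
info=$($rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")')
[ "$(echo "$info" | jq -r .state)" = "configuring" ] || exit 1
[ "$(echo "$info" | jq -r .num_base_bdevs_discovered)" -eq 0 ] || exit 1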
[2024-07-12 15:54:20.094030] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:59.700 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:59.960 [2024-07-12 15:54:20.296976] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:59.960 BaseBdev1 00:17:59.960 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:59.960 15:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:59.960 15:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:59.960 15:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:59.960 15:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:59.960 15:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:59.960 15:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:00.219 15:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:00.479 [ 00:18:00.479 { 00:18:00.479 "name": "BaseBdev1", 00:18:00.479 "aliases": [ 00:18:00.479 "892122c7-6602-4daf-aeb5-0cc7ff609d90" 00:18:00.479 ], 00:18:00.479 "product_name": "Malloc disk", 00:18:00.479 "block_size": 512, 00:18:00.479 "num_blocks": 65536, 00:18:00.479 "uuid": "892122c7-6602-4daf-aeb5-0cc7ff609d90", 00:18:00.479 "assigned_rate_limits": { 00:18:00.479 "rw_ios_per_sec": 0, 00:18:00.479 "rw_mbytes_per_sec": 0, 00:18:00.479 "r_mbytes_per_sec": 0, 00:18:00.479 "w_mbytes_per_sec": 0 00:18:00.479 }, 00:18:00.479 "claimed": true, 00:18:00.479 "claim_type": "exclusive_write", 00:18:00.479 "zoned": false, 00:18:00.479 "supported_io_types": { 00:18:00.479 "read": true, 00:18:00.479 "write": true, 00:18:00.479 "unmap": true, 00:18:00.479 "flush": true, 00:18:00.479 "reset": true, 00:18:00.479 "nvme_admin": false, 00:18:00.479 "nvme_io": false, 00:18:00.479 "nvme_io_md": false, 00:18:00.479 "write_zeroes": true, 00:18:00.479 "zcopy": true, 00:18:00.479 "get_zone_info": false, 00:18:00.479 "zone_management": false, 00:18:00.479 "zone_append": false, 00:18:00.479 "compare": false, 00:18:00.479 "compare_and_write": false, 00:18:00.479 "abort": true, 00:18:00.479 "seek_hole": false, 00:18:00.479 "seek_data": false, 00:18:00.479 "copy": true, 00:18:00.479 "nvme_iov_md": false 00:18:00.479 }, 00:18:00.479 "memory_domains": [ 00:18:00.479 { 00:18:00.479 "dma_device_id": "system", 00:18:00.479 "dma_device_type": 1 00:18:00.479 }, 00:18:00.479 { 00:18:00.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.479 "dma_device_type": 2 00:18:00.479 } 00:18:00.479 ], 00:18:00.479 "driver_specific": {} 00:18:00.479 } 00:18:00.479 ] 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.479 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.479 "name": "Existed_Raid", 00:18:00.479 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.479 "strip_size_kb": 64, 00:18:00.479 "state": "configuring", 00:18:00.479 "raid_level": "raid0", 00:18:00.479 "superblock": false, 00:18:00.479 "num_base_bdevs": 4, 00:18:00.479 "num_base_bdevs_discovered": 1, 00:18:00.479 "num_base_bdevs_operational": 4, 00:18:00.479 "base_bdevs_list": [ 00:18:00.479 { 00:18:00.479 "name": "BaseBdev1", 00:18:00.479 "uuid": "892122c7-6602-4daf-aeb5-0cc7ff609d90", 00:18:00.479 "is_configured": true, 00:18:00.479 "data_offset": 0, 00:18:00.480 "data_size": 65536 00:18:00.480 }, 00:18:00.480 { 00:18:00.480 "name": "BaseBdev2", 00:18:00.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.480 "is_configured": false, 00:18:00.480 "data_offset": 0, 00:18:00.480 "data_size": 0 00:18:00.480 }, 00:18:00.480 { 00:18:00.480 "name": "BaseBdev3", 00:18:00.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.480 "is_configured": false, 00:18:00.480 "data_offset": 0, 00:18:00.480 "data_size": 0 00:18:00.480 }, 00:18:00.480 { 00:18:00.480 "name": "BaseBdev4", 00:18:00.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.480 "is_configured": false, 00:18:00.480 "data_offset": 0, 00:18:00.480 "data_size": 0 00:18:00.480 } 00:18:00.480 ] 00:18:00.480 }' 00:18:00.480 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.480 15:54:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:01.050 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:01.310 [2024-07-12 15:54:21.608291] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:01.310 [2024-07-12 15:54:21.608318] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e0190 name Existed_Raid, state configuring 00:18:01.310 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:01.569 [2024-07-12 15:54:21.796799] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:01.569 [2024-07-12 15:54:21.797904] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:01.569 [2024-07-12 15:54:21.797925] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:01.569 [2024-07-12 15:54:21.797931] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:01.569 [2024-07-12 15:54:21.797937] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:01.569 [2024-07-12 15:54:21.797942] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:01.569 [2024-07-12 15:54:21.797947] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.569 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:01.569 15:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.569 "name": "Existed_Raid", 00:18:01.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:01.569 "strip_size_kb": 64, 00:18:01.569 "state": "configuring", 00:18:01.569 "raid_level": "raid0", 00:18:01.569 "superblock": false, 00:18:01.569 "num_base_bdevs": 4, 00:18:01.569 "num_base_bdevs_discovered": 1, 00:18:01.569 "num_base_bdevs_operational": 4, 00:18:01.569 "base_bdevs_list": [ 00:18:01.569 { 00:18:01.569 "name": "BaseBdev1", 00:18:01.569 "uuid": "892122c7-6602-4daf-aeb5-0cc7ff609d90", 00:18:01.569 "is_configured": true, 00:18:01.569 "data_offset": 0, 00:18:01.569 "data_size": 65536 00:18:01.569 }, 00:18:01.569 { 00:18:01.569 "name": "BaseBdev2", 00:18:01.569 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:18:01.569 "is_configured": false, 00:18:01.569 "data_offset": 0, 00:18:01.569 "data_size": 0 00:18:01.569 }, 00:18:01.569 { 00:18:01.569 "name": "BaseBdev3", 00:18:01.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:01.569 "is_configured": false, 00:18:01.569 "data_offset": 0, 00:18:01.569 "data_size": 0 00:18:01.569 }, 00:18:01.569 { 00:18:01.569 "name": "BaseBdev4", 00:18:01.569 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:01.569 "is_configured": false, 00:18:01.569 "data_offset": 0, 00:18:01.569 "data_size": 0 00:18:01.569 } 00:18:01.569 ] 00:18:01.569 }' 00:18:01.569 15:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.569 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.139 15:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:02.398 [2024-07-12 15:54:22.740189] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:02.398 BaseBdev2 00:18:02.398 15:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:02.398 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:02.398 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:02.398 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:02.398 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:02.398 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:02.398 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:02.658 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:02.917 [ 00:18:02.917 { 00:18:02.917 "name": "BaseBdev2", 00:18:02.917 "aliases": [ 00:18:02.917 "f4c37ddf-5db2-4b14-bf90-e4bf28fe8f2d" 00:18:02.917 ], 00:18:02.917 "product_name": "Malloc disk", 00:18:02.917 "block_size": 512, 00:18:02.917 "num_blocks": 65536, 00:18:02.917 "uuid": "f4c37ddf-5db2-4b14-bf90-e4bf28fe8f2d", 00:18:02.917 "assigned_rate_limits": { 00:18:02.917 "rw_ios_per_sec": 0, 00:18:02.918 "rw_mbytes_per_sec": 0, 00:18:02.918 "r_mbytes_per_sec": 0, 00:18:02.918 "w_mbytes_per_sec": 0 00:18:02.918 }, 00:18:02.918 "claimed": true, 00:18:02.918 "claim_type": "exclusive_write", 00:18:02.918 "zoned": false, 00:18:02.918 "supported_io_types": { 00:18:02.918 "read": true, 00:18:02.918 "write": true, 00:18:02.918 "unmap": true, 00:18:02.918 "flush": true, 00:18:02.918 "reset": true, 00:18:02.918 "nvme_admin": false, 00:18:02.918 "nvme_io": false, 00:18:02.918 "nvme_io_md": false, 00:18:02.918 "write_zeroes": true, 00:18:02.918 "zcopy": true, 00:18:02.918 "get_zone_info": false, 00:18:02.918 "zone_management": false, 00:18:02.918 "zone_append": false, 00:18:02.918 "compare": false, 00:18:02.918 "compare_and_write": false, 00:18:02.918 "abort": true, 00:18:02.918 "seek_hole": false, 00:18:02.918 "seek_data": false, 00:18:02.918 
"copy": true, 00:18:02.918 "nvme_iov_md": false 00:18:02.918 }, 00:18:02.918 "memory_domains": [ 00:18:02.918 { 00:18:02.918 "dma_device_id": "system", 00:18:02.918 "dma_device_type": 1 00:18:02.918 }, 00:18:02.918 { 00:18:02.918 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.918 "dma_device_type": 2 00:18:02.918 } 00:18:02.918 ], 00:18:02.918 "driver_specific": {} 00:18:02.918 } 00:18:02.918 ] 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:02.918 "name": "Existed_Raid", 00:18:02.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.918 "strip_size_kb": 64, 00:18:02.918 "state": "configuring", 00:18:02.918 "raid_level": "raid0", 00:18:02.918 "superblock": false, 00:18:02.918 "num_base_bdevs": 4, 00:18:02.918 "num_base_bdevs_discovered": 2, 00:18:02.918 "num_base_bdevs_operational": 4, 00:18:02.918 "base_bdevs_list": [ 00:18:02.918 { 00:18:02.918 "name": "BaseBdev1", 00:18:02.918 "uuid": "892122c7-6602-4daf-aeb5-0cc7ff609d90", 00:18:02.918 "is_configured": true, 00:18:02.918 "data_offset": 0, 00:18:02.918 "data_size": 65536 00:18:02.918 }, 00:18:02.918 { 00:18:02.918 "name": "BaseBdev2", 00:18:02.918 "uuid": "f4c37ddf-5db2-4b14-bf90-e4bf28fe8f2d", 00:18:02.918 "is_configured": true, 00:18:02.918 "data_offset": 0, 00:18:02.918 "data_size": 65536 00:18:02.918 }, 00:18:02.918 { 00:18:02.918 "name": "BaseBdev3", 00:18:02.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.918 "is_configured": false, 00:18:02.918 "data_offset": 0, 00:18:02.918 "data_size": 0 00:18:02.918 }, 00:18:02.918 { 00:18:02.918 "name": "BaseBdev4", 00:18:02.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.918 "is_configured": false, 00:18:02.918 
"data_offset": 0, 00:18:02.918 "data_size": 0 00:18:02.918 } 00:18:02.918 ] 00:18:02.918 }' 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:02.918 15:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:03.488 15:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:03.748 [2024-07-12 15:54:24.024266] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:03.748 BaseBdev3 00:18:03.748 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:03.748 15:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:03.748 15:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:03.748 15:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:03.748 15:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:03.748 15:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:03.748 15:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:04.009 [ 00:18:04.009 { 00:18:04.009 "name": "BaseBdev3", 00:18:04.009 "aliases": [ 00:18:04.009 "c6221d9d-8e2f-4c4e-9eb3-d689b496a6f6" 00:18:04.009 ], 00:18:04.009 "product_name": "Malloc disk", 00:18:04.009 "block_size": 512, 00:18:04.009 "num_blocks": 65536, 00:18:04.009 "uuid": "c6221d9d-8e2f-4c4e-9eb3-d689b496a6f6", 00:18:04.009 "assigned_rate_limits": { 00:18:04.009 "rw_ios_per_sec": 0, 00:18:04.009 "rw_mbytes_per_sec": 0, 00:18:04.009 "r_mbytes_per_sec": 0, 00:18:04.009 "w_mbytes_per_sec": 0 00:18:04.009 }, 00:18:04.009 "claimed": true, 00:18:04.009 "claim_type": "exclusive_write", 00:18:04.009 "zoned": false, 00:18:04.009 "supported_io_types": { 00:18:04.009 "read": true, 00:18:04.009 "write": true, 00:18:04.009 "unmap": true, 00:18:04.009 "flush": true, 00:18:04.009 "reset": true, 00:18:04.009 "nvme_admin": false, 00:18:04.009 "nvme_io": false, 00:18:04.009 "nvme_io_md": false, 00:18:04.009 "write_zeroes": true, 00:18:04.009 "zcopy": true, 00:18:04.009 "get_zone_info": false, 00:18:04.009 "zone_management": false, 00:18:04.009 "zone_append": false, 00:18:04.009 "compare": false, 00:18:04.009 "compare_and_write": false, 00:18:04.009 "abort": true, 00:18:04.009 "seek_hole": false, 00:18:04.009 "seek_data": false, 00:18:04.009 "copy": true, 00:18:04.009 "nvme_iov_md": false 00:18:04.009 }, 00:18:04.009 "memory_domains": [ 00:18:04.009 { 00:18:04.009 "dma_device_id": "system", 00:18:04.009 "dma_device_type": 1 00:18:04.009 }, 00:18:04.009 { 00:18:04.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.009 "dma_device_type": 2 00:18:04.009 } 00:18:04.009 ], 00:18:04.009 "driver_specific": {} 00:18:04.009 } 00:18:04.009 ] 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:04.009 15:54:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:04.009 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.270 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:04.270 "name": "Existed_Raid", 00:18:04.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.270 "strip_size_kb": 64, 00:18:04.270 "state": "configuring", 00:18:04.270 "raid_level": "raid0", 00:18:04.270 "superblock": false, 00:18:04.270 "num_base_bdevs": 4, 00:18:04.270 "num_base_bdevs_discovered": 3, 00:18:04.270 "num_base_bdevs_operational": 4, 00:18:04.270 "base_bdevs_list": [ 00:18:04.270 { 00:18:04.270 "name": "BaseBdev1", 00:18:04.270 "uuid": "892122c7-6602-4daf-aeb5-0cc7ff609d90", 00:18:04.270 "is_configured": true, 00:18:04.270 "data_offset": 0, 00:18:04.270 "data_size": 65536 00:18:04.270 }, 00:18:04.270 { 00:18:04.270 "name": "BaseBdev2", 00:18:04.270 "uuid": "f4c37ddf-5db2-4b14-bf90-e4bf28fe8f2d", 00:18:04.270 "is_configured": true, 00:18:04.270 "data_offset": 0, 00:18:04.270 "data_size": 65536 00:18:04.270 }, 00:18:04.270 { 00:18:04.270 "name": "BaseBdev3", 00:18:04.270 "uuid": "c6221d9d-8e2f-4c4e-9eb3-d689b496a6f6", 00:18:04.270 "is_configured": true, 00:18:04.270 "data_offset": 0, 00:18:04.270 "data_size": 65536 00:18:04.270 }, 00:18:04.270 { 00:18:04.270 "name": "BaseBdev4", 00:18:04.270 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.270 "is_configured": false, 00:18:04.270 "data_offset": 0, 00:18:04.270 "data_size": 0 00:18:04.270 } 00:18:04.270 ] 00:18:04.270 }' 00:18:04.270 15:54:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:04.270 15:54:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.839 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:05.099 [2024-07-12 
15:54:25.348604] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:05.099 [2024-07-12 15:54:25.348627] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13e11d0 00:18:05.099 [2024-07-12 15:54:25.348631] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:05.099 [2024-07-12 15:54:25.348819] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e2220 00:18:05.099 [2024-07-12 15:54:25.348913] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13e11d0 00:18:05.099 [2024-07-12 15:54:25.348919] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13e11d0 00:18:05.099 [2024-07-12 15:54:25.349036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:05.099 BaseBdev4 00:18:05.099 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:05.099 15:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:05.099 15:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:05.099 15:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:05.099 15:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:05.099 15:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:05.099 15:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:05.362 [ 00:18:05.362 { 00:18:05.362 "name": "BaseBdev4", 00:18:05.362 "aliases": [ 00:18:05.362 "e36f0c03-e0d7-4b69-a673-4a8262accd9d" 00:18:05.362 ], 00:18:05.362 "product_name": "Malloc disk", 00:18:05.362 "block_size": 512, 00:18:05.362 "num_blocks": 65536, 00:18:05.362 "uuid": "e36f0c03-e0d7-4b69-a673-4a8262accd9d", 00:18:05.362 "assigned_rate_limits": { 00:18:05.362 "rw_ios_per_sec": 0, 00:18:05.362 "rw_mbytes_per_sec": 0, 00:18:05.362 "r_mbytes_per_sec": 0, 00:18:05.362 "w_mbytes_per_sec": 0 00:18:05.362 }, 00:18:05.362 "claimed": true, 00:18:05.362 "claim_type": "exclusive_write", 00:18:05.362 "zoned": false, 00:18:05.362 "supported_io_types": { 00:18:05.362 "read": true, 00:18:05.362 "write": true, 00:18:05.362 "unmap": true, 00:18:05.362 "flush": true, 00:18:05.362 "reset": true, 00:18:05.362 "nvme_admin": false, 00:18:05.362 "nvme_io": false, 00:18:05.362 "nvme_io_md": false, 00:18:05.362 "write_zeroes": true, 00:18:05.362 "zcopy": true, 00:18:05.362 "get_zone_info": false, 00:18:05.362 "zone_management": false, 00:18:05.362 "zone_append": false, 00:18:05.362 "compare": false, 00:18:05.362 "compare_and_write": false, 00:18:05.362 "abort": true, 00:18:05.362 "seek_hole": false, 00:18:05.362 "seek_data": false, 00:18:05.362 "copy": true, 00:18:05.362 "nvme_iov_md": false 00:18:05.362 }, 00:18:05.362 "memory_domains": [ 00:18:05.362 { 00:18:05.362 "dma_device_id": "system", 00:18:05.362 "dma_device_type": 1 00:18:05.362 }, 00:18:05.362 { 00:18:05.362 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.362 "dma_device_type": 2 
00:18:05.362 } 00:18:05.362 ], 00:18:05.362 "driver_specific": {} 00:18:05.362 } 00:18:05.362 ] 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.362 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:05.658 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.658 "name": "Existed_Raid", 00:18:05.658 "uuid": "2775281d-3b48-4c38-817a-9d71b893e853", 00:18:05.658 "strip_size_kb": 64, 00:18:05.658 "state": "online", 00:18:05.658 "raid_level": "raid0", 00:18:05.658 "superblock": false, 00:18:05.658 "num_base_bdevs": 4, 00:18:05.658 "num_base_bdevs_discovered": 4, 00:18:05.658 "num_base_bdevs_operational": 4, 00:18:05.658 "base_bdevs_list": [ 00:18:05.658 { 00:18:05.658 "name": "BaseBdev1", 00:18:05.658 "uuid": "892122c7-6602-4daf-aeb5-0cc7ff609d90", 00:18:05.658 "is_configured": true, 00:18:05.658 "data_offset": 0, 00:18:05.658 "data_size": 65536 00:18:05.658 }, 00:18:05.658 { 00:18:05.658 "name": "BaseBdev2", 00:18:05.658 "uuid": "f4c37ddf-5db2-4b14-bf90-e4bf28fe8f2d", 00:18:05.658 "is_configured": true, 00:18:05.658 "data_offset": 0, 00:18:05.658 "data_size": 65536 00:18:05.658 }, 00:18:05.658 { 00:18:05.658 "name": "BaseBdev3", 00:18:05.658 "uuid": "c6221d9d-8e2f-4c4e-9eb3-d689b496a6f6", 00:18:05.658 "is_configured": true, 00:18:05.658 "data_offset": 0, 00:18:05.658 "data_size": 65536 00:18:05.658 }, 00:18:05.658 { 00:18:05.658 "name": "BaseBdev4", 00:18:05.658 "uuid": "e36f0c03-e0d7-4b69-a673-4a8262accd9d", 00:18:05.658 "is_configured": true, 00:18:05.658 "data_offset": 0, 00:18:05.658 "data_size": 65536 00:18:05.658 } 00:18:05.658 ] 00:18:05.658 }' 00:18:05.658 15:54:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.658 15:54:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:06.228 15:54:26 
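Each of the four base bdevs above was attached with the same create-and-wait pattern before the raid finally reported "online". A minimal sketch of one such step, using the RPC calls that appear in the trace (the 2000 ms timeout mirrors the -t 2000 argument passed by waitforbdev):

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Create a 32 MB malloc bdev with 512-byte blocks to act as one raid member.
$rpc bdev_malloc_create 32 512 -b BaseBdev1

# Let bdev examine callbacks run (this is what allows the raid to claim the new
# bdev), then confirm the bdev is visible, waiting up to 2000 ms as in the trace.
$rpc bdev_wait_for_examine
$rpc bdev_get_bdevs -b BaseBdev1 -t 2000 > /dev/null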
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:06.228 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:06.228 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:06.228 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:06.228 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:06.228 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:06.228 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:06.228 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:06.228 [2024-07-12 15:54:26.620108] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:06.228 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:06.228 "name": "Existed_Raid", 00:18:06.228 "aliases": [ 00:18:06.229 "2775281d-3b48-4c38-817a-9d71b893e853" 00:18:06.229 ], 00:18:06.229 "product_name": "Raid Volume", 00:18:06.229 "block_size": 512, 00:18:06.229 "num_blocks": 262144, 00:18:06.229 "uuid": "2775281d-3b48-4c38-817a-9d71b893e853", 00:18:06.229 "assigned_rate_limits": { 00:18:06.229 "rw_ios_per_sec": 0, 00:18:06.229 "rw_mbytes_per_sec": 0, 00:18:06.229 "r_mbytes_per_sec": 0, 00:18:06.229 "w_mbytes_per_sec": 0 00:18:06.229 }, 00:18:06.229 "claimed": false, 00:18:06.229 "zoned": false, 00:18:06.229 "supported_io_types": { 00:18:06.229 "read": true, 00:18:06.229 "write": true, 00:18:06.229 "unmap": true, 00:18:06.229 "flush": true, 00:18:06.229 "reset": true, 00:18:06.229 "nvme_admin": false, 00:18:06.229 "nvme_io": false, 00:18:06.229 "nvme_io_md": false, 00:18:06.229 "write_zeroes": true, 00:18:06.229 "zcopy": false, 00:18:06.229 "get_zone_info": false, 00:18:06.229 "zone_management": false, 00:18:06.229 "zone_append": false, 00:18:06.229 "compare": false, 00:18:06.229 "compare_and_write": false, 00:18:06.229 "abort": false, 00:18:06.229 "seek_hole": false, 00:18:06.229 "seek_data": false, 00:18:06.229 "copy": false, 00:18:06.229 "nvme_iov_md": false 00:18:06.229 }, 00:18:06.229 "memory_domains": [ 00:18:06.229 { 00:18:06.229 "dma_device_id": "system", 00:18:06.229 "dma_device_type": 1 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.229 "dma_device_type": 2 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "dma_device_id": "system", 00:18:06.229 "dma_device_type": 1 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.229 "dma_device_type": 2 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "dma_device_id": "system", 00:18:06.229 "dma_device_type": 1 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.229 "dma_device_type": 2 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "dma_device_id": "system", 00:18:06.229 "dma_device_type": 1 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.229 "dma_device_type": 2 00:18:06.229 } 00:18:06.229 ], 00:18:06.229 "driver_specific": { 00:18:06.229 "raid": { 00:18:06.229 "uuid": "2775281d-3b48-4c38-817a-9d71b893e853", 00:18:06.229 "strip_size_kb": 64, 00:18:06.229 
"state": "online", 00:18:06.229 "raid_level": "raid0", 00:18:06.229 "superblock": false, 00:18:06.229 "num_base_bdevs": 4, 00:18:06.229 "num_base_bdevs_discovered": 4, 00:18:06.229 "num_base_bdevs_operational": 4, 00:18:06.229 "base_bdevs_list": [ 00:18:06.229 { 00:18:06.229 "name": "BaseBdev1", 00:18:06.229 "uuid": "892122c7-6602-4daf-aeb5-0cc7ff609d90", 00:18:06.229 "is_configured": true, 00:18:06.229 "data_offset": 0, 00:18:06.229 "data_size": 65536 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "name": "BaseBdev2", 00:18:06.229 "uuid": "f4c37ddf-5db2-4b14-bf90-e4bf28fe8f2d", 00:18:06.229 "is_configured": true, 00:18:06.229 "data_offset": 0, 00:18:06.229 "data_size": 65536 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "name": "BaseBdev3", 00:18:06.229 "uuid": "c6221d9d-8e2f-4c4e-9eb3-d689b496a6f6", 00:18:06.229 "is_configured": true, 00:18:06.229 "data_offset": 0, 00:18:06.229 "data_size": 65536 00:18:06.229 }, 00:18:06.229 { 00:18:06.229 "name": "BaseBdev4", 00:18:06.229 "uuid": "e36f0c03-e0d7-4b69-a673-4a8262accd9d", 00:18:06.229 "is_configured": true, 00:18:06.229 "data_offset": 0, 00:18:06.229 "data_size": 65536 00:18:06.229 } 00:18:06.229 ] 00:18:06.229 } 00:18:06.229 } 00:18:06.229 }' 00:18:06.229 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:06.489 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:06.489 BaseBdev2 00:18:06.489 BaseBdev3 00:18:06.489 BaseBdev4' 00:18:06.489 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:06.489 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:06.489 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:06.489 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:06.489 "name": "BaseBdev1", 00:18:06.489 "aliases": [ 00:18:06.489 "892122c7-6602-4daf-aeb5-0cc7ff609d90" 00:18:06.489 ], 00:18:06.489 "product_name": "Malloc disk", 00:18:06.489 "block_size": 512, 00:18:06.489 "num_blocks": 65536, 00:18:06.489 "uuid": "892122c7-6602-4daf-aeb5-0cc7ff609d90", 00:18:06.489 "assigned_rate_limits": { 00:18:06.489 "rw_ios_per_sec": 0, 00:18:06.489 "rw_mbytes_per_sec": 0, 00:18:06.489 "r_mbytes_per_sec": 0, 00:18:06.489 "w_mbytes_per_sec": 0 00:18:06.489 }, 00:18:06.489 "claimed": true, 00:18:06.489 "claim_type": "exclusive_write", 00:18:06.489 "zoned": false, 00:18:06.489 "supported_io_types": { 00:18:06.489 "read": true, 00:18:06.489 "write": true, 00:18:06.489 "unmap": true, 00:18:06.489 "flush": true, 00:18:06.489 "reset": true, 00:18:06.489 "nvme_admin": false, 00:18:06.489 "nvme_io": false, 00:18:06.489 "nvme_io_md": false, 00:18:06.489 "write_zeroes": true, 00:18:06.489 "zcopy": true, 00:18:06.489 "get_zone_info": false, 00:18:06.489 "zone_management": false, 00:18:06.489 "zone_append": false, 00:18:06.489 "compare": false, 00:18:06.489 "compare_and_write": false, 00:18:06.489 "abort": true, 00:18:06.489 "seek_hole": false, 00:18:06.489 "seek_data": false, 00:18:06.489 "copy": true, 00:18:06.489 "nvme_iov_md": false 00:18:06.489 }, 00:18:06.489 "memory_domains": [ 00:18:06.489 { 00:18:06.489 "dma_device_id": "system", 00:18:06.489 "dma_device_type": 1 00:18:06.489 }, 00:18:06.489 { 00:18:06.489 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.489 "dma_device_type": 2 00:18:06.489 } 00:18:06.489 ], 00:18:06.489 "driver_specific": {} 00:18:06.489 }' 00:18:06.489 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.489 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:06.749 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:06.749 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.749 15:54:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:06.749 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:07.008 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:07.009 "name": "BaseBdev2", 00:18:07.009 "aliases": [ 00:18:07.009 "f4c37ddf-5db2-4b14-bf90-e4bf28fe8f2d" 00:18:07.009 ], 00:18:07.009 "product_name": "Malloc disk", 00:18:07.009 "block_size": 512, 00:18:07.009 "num_blocks": 65536, 00:18:07.009 "uuid": "f4c37ddf-5db2-4b14-bf90-e4bf28fe8f2d", 00:18:07.009 "assigned_rate_limits": { 00:18:07.009 "rw_ios_per_sec": 0, 00:18:07.009 "rw_mbytes_per_sec": 0, 00:18:07.009 "r_mbytes_per_sec": 0, 00:18:07.009 "w_mbytes_per_sec": 0 00:18:07.009 }, 00:18:07.009 "claimed": true, 00:18:07.009 "claim_type": "exclusive_write", 00:18:07.009 "zoned": false, 00:18:07.009 "supported_io_types": { 00:18:07.009 "read": true, 00:18:07.009 "write": true, 00:18:07.009 "unmap": true, 00:18:07.009 "flush": true, 00:18:07.009 "reset": true, 00:18:07.009 "nvme_admin": false, 00:18:07.009 "nvme_io": false, 00:18:07.009 "nvme_io_md": false, 00:18:07.009 "write_zeroes": true, 00:18:07.009 "zcopy": true, 00:18:07.009 "get_zone_info": false, 00:18:07.009 "zone_management": false, 00:18:07.009 "zone_append": false, 00:18:07.009 "compare": false, 00:18:07.009 "compare_and_write": false, 00:18:07.009 "abort": true, 00:18:07.009 "seek_hole": false, 00:18:07.009 "seek_data": false, 00:18:07.009 "copy": true, 00:18:07.009 "nvme_iov_md": false 00:18:07.009 }, 00:18:07.009 "memory_domains": [ 00:18:07.009 { 00:18:07.009 "dma_device_id": "system", 00:18:07.009 "dma_device_type": 1 00:18:07.009 }, 00:18:07.009 { 00:18:07.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.009 "dma_device_type": 2 00:18:07.009 } 00:18:07.009 ], 00:18:07.009 "driver_specific": {} 00:18:07.009 }' 00:18:07.009 15:54:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:07.009 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:07.268 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:07.268 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:07.268 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:07.268 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:07.268 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:07.268 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:07.268 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:07.268 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:07.268 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:07.529 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:07.530 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:07.530 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:07.530 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:07.530 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:07.530 "name": "BaseBdev3", 00:18:07.530 "aliases": [ 00:18:07.530 "c6221d9d-8e2f-4c4e-9eb3-d689b496a6f6" 00:18:07.530 ], 00:18:07.530 "product_name": "Malloc disk", 00:18:07.530 "block_size": 512, 00:18:07.530 "num_blocks": 65536, 00:18:07.530 "uuid": "c6221d9d-8e2f-4c4e-9eb3-d689b496a6f6", 00:18:07.530 "assigned_rate_limits": { 00:18:07.530 "rw_ios_per_sec": 0, 00:18:07.530 "rw_mbytes_per_sec": 0, 00:18:07.530 "r_mbytes_per_sec": 0, 00:18:07.530 "w_mbytes_per_sec": 0 00:18:07.530 }, 00:18:07.530 "claimed": true, 00:18:07.530 "claim_type": "exclusive_write", 00:18:07.530 "zoned": false, 00:18:07.530 "supported_io_types": { 00:18:07.530 "read": true, 00:18:07.530 "write": true, 00:18:07.530 "unmap": true, 00:18:07.530 "flush": true, 00:18:07.530 "reset": true, 00:18:07.530 "nvme_admin": false, 00:18:07.530 "nvme_io": false, 00:18:07.530 "nvme_io_md": false, 00:18:07.530 "write_zeroes": true, 00:18:07.530 "zcopy": true, 00:18:07.530 "get_zone_info": false, 00:18:07.530 "zone_management": false, 00:18:07.530 "zone_append": false, 00:18:07.530 "compare": false, 00:18:07.530 "compare_and_write": false, 00:18:07.530 "abort": true, 00:18:07.530 "seek_hole": false, 00:18:07.530 "seek_data": false, 00:18:07.530 "copy": true, 00:18:07.530 "nvme_iov_md": false 00:18:07.530 }, 00:18:07.530 "memory_domains": [ 00:18:07.530 { 00:18:07.530 "dma_device_id": "system", 00:18:07.530 "dma_device_type": 1 00:18:07.530 }, 00:18:07.530 { 00:18:07.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.530 "dma_device_type": 2 00:18:07.530 } 00:18:07.530 ], 00:18:07.530 "driver_specific": {} 00:18:07.530 }' 00:18:07.530 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:07.791 15:54:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
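The block_size/md_size/md_interleave/dif_type checks seen here repeat once per base bdev. A condensed sketch of a single iteration, assuming the same JSON shape the trace prints for a malloc base bdev:

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"

# Dump one configured member of Existed_Raid and unwrap the single-element array.
base_bdev_info=$($rpc bdev_get_bdevs -b BaseBdev2 | jq '.[]')

# Malloc members should match the raid's 512-byte block size and carry no
# metadata or DIF settings (jq prints null for fields that are absent).
[ "$(echo "$base_bdev_info" | jq .block_size)" -eq 512 ] || exit 1
[ "$(echo "$base_bdev_info" | jq .md_size)" = "null" ] || exit 1
[ "$(echo "$base_bdev_info" | jq .md_interleave)" = "null" ] || exit 1
[ "$(echo "$base_bdev_info" | jq .dif_type)" = "null" ] || exit 1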
00:18:07.791 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:07.791 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:07.791 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:07.791 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:07.791 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:07.791 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:07.791 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:07.791 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:08.051 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:08.051 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:08.051 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:08.051 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:08.051 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:08.051 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:08.051 "name": "BaseBdev4", 00:18:08.051 "aliases": [ 00:18:08.051 "e36f0c03-e0d7-4b69-a673-4a8262accd9d" 00:18:08.051 ], 00:18:08.051 "product_name": "Malloc disk", 00:18:08.051 "block_size": 512, 00:18:08.051 "num_blocks": 65536, 00:18:08.051 "uuid": "e36f0c03-e0d7-4b69-a673-4a8262accd9d", 00:18:08.051 "assigned_rate_limits": { 00:18:08.051 "rw_ios_per_sec": 0, 00:18:08.051 "rw_mbytes_per_sec": 0, 00:18:08.051 "r_mbytes_per_sec": 0, 00:18:08.051 "w_mbytes_per_sec": 0 00:18:08.051 }, 00:18:08.051 "claimed": true, 00:18:08.051 "claim_type": "exclusive_write", 00:18:08.051 "zoned": false, 00:18:08.051 "supported_io_types": { 00:18:08.051 "read": true, 00:18:08.051 "write": true, 00:18:08.051 "unmap": true, 00:18:08.051 "flush": true, 00:18:08.051 "reset": true, 00:18:08.051 "nvme_admin": false, 00:18:08.051 "nvme_io": false, 00:18:08.051 "nvme_io_md": false, 00:18:08.051 "write_zeroes": true, 00:18:08.051 "zcopy": true, 00:18:08.051 "get_zone_info": false, 00:18:08.051 "zone_management": false, 00:18:08.052 "zone_append": false, 00:18:08.052 "compare": false, 00:18:08.052 "compare_and_write": false, 00:18:08.052 "abort": true, 00:18:08.052 "seek_hole": false, 00:18:08.052 "seek_data": false, 00:18:08.052 "copy": true, 00:18:08.052 "nvme_iov_md": false 00:18:08.052 }, 00:18:08.052 "memory_domains": [ 00:18:08.052 { 00:18:08.052 "dma_device_id": "system", 00:18:08.052 "dma_device_type": 1 00:18:08.052 }, 00:18:08.052 { 00:18:08.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:08.052 "dma_device_type": 2 00:18:08.052 } 00:18:08.052 ], 00:18:08.052 "driver_specific": {} 00:18:08.052 }' 00:18:08.052 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:08.310 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:08.310 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:08.310 15:54:28 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:08.310 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:08.310 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:08.310 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:08.310 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:08.568 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:08.568 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:08.568 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:08.568 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:08.568 15:54:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:08.826 [2024-07-12 15:54:29.034007] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:08.826 [2024-07-12 15:54:29.034024] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:08.826 [2024-07-12 15:54:29.034060] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:08.826 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:08.827 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.827 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.827 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.827 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.827 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.827 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.827 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.827 
"name": "Existed_Raid", 00:18:08.827 "uuid": "2775281d-3b48-4c38-817a-9d71b893e853", 00:18:08.827 "strip_size_kb": 64, 00:18:08.827 "state": "offline", 00:18:08.827 "raid_level": "raid0", 00:18:08.827 "superblock": false, 00:18:08.827 "num_base_bdevs": 4, 00:18:08.827 "num_base_bdevs_discovered": 3, 00:18:08.827 "num_base_bdevs_operational": 3, 00:18:08.827 "base_bdevs_list": [ 00:18:08.827 { 00:18:08.827 "name": null, 00:18:08.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.827 "is_configured": false, 00:18:08.827 "data_offset": 0, 00:18:08.827 "data_size": 65536 00:18:08.827 }, 00:18:08.827 { 00:18:08.827 "name": "BaseBdev2", 00:18:08.827 "uuid": "f4c37ddf-5db2-4b14-bf90-e4bf28fe8f2d", 00:18:08.827 "is_configured": true, 00:18:08.827 "data_offset": 0, 00:18:08.827 "data_size": 65536 00:18:08.827 }, 00:18:08.827 { 00:18:08.827 "name": "BaseBdev3", 00:18:08.827 "uuid": "c6221d9d-8e2f-4c4e-9eb3-d689b496a6f6", 00:18:08.827 "is_configured": true, 00:18:08.827 "data_offset": 0, 00:18:08.827 "data_size": 65536 00:18:08.827 }, 00:18:08.827 { 00:18:08.827 "name": "BaseBdev4", 00:18:08.827 "uuid": "e36f0c03-e0d7-4b69-a673-4a8262accd9d", 00:18:08.827 "is_configured": true, 00:18:08.827 "data_offset": 0, 00:18:08.827 "data_size": 65536 00:18:08.827 } 00:18:08.827 ] 00:18:08.827 }' 00:18:08.827 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.827 15:54:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:09.395 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:09.395 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:09.395 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:09.395 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.654 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:09.654 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:09.654 15:54:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:09.913 [2024-07-12 15:54:30.168886] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:09.913 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:09.913 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:09.913 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.913 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:10.172 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:10.172 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:10.172 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:10.172 
[2024-07-12 15:54:30.555526] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:10.172 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:10.172 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:10.172 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.172 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:10.431 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:10.431 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:10.431 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:10.691 [2024-07-12 15:54:30.930272] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:10.691 [2024-07-12 15:54:30.930300] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e11d0 name Existed_Raid, state offline 00:18:10.691 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:10.691 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:10.691 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:10.691 15:54:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:10.951 BaseBdev2 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:10.951 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:11.210 15:54:31 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:11.470 [ 00:18:11.470 { 00:18:11.470 "name": "BaseBdev2", 00:18:11.470 "aliases": [ 00:18:11.470 "c8e4cc49-18ee-4228-a30b-a22ae241d6f8" 00:18:11.470 ], 00:18:11.470 "product_name": "Malloc disk", 00:18:11.470 "block_size": 512, 00:18:11.470 "num_blocks": 65536, 00:18:11.470 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:11.470 "assigned_rate_limits": { 00:18:11.470 "rw_ios_per_sec": 0, 00:18:11.470 "rw_mbytes_per_sec": 0, 00:18:11.470 "r_mbytes_per_sec": 0, 00:18:11.470 "w_mbytes_per_sec": 0 00:18:11.470 }, 00:18:11.470 "claimed": false, 00:18:11.470 "zoned": false, 00:18:11.470 "supported_io_types": { 00:18:11.470 "read": true, 00:18:11.470 "write": true, 00:18:11.470 "unmap": true, 00:18:11.470 "flush": true, 00:18:11.470 "reset": true, 00:18:11.470 "nvme_admin": false, 00:18:11.470 "nvme_io": false, 00:18:11.470 "nvme_io_md": false, 00:18:11.470 "write_zeroes": true, 00:18:11.470 "zcopy": true, 00:18:11.470 "get_zone_info": false, 00:18:11.470 "zone_management": false, 00:18:11.470 "zone_append": false, 00:18:11.470 "compare": false, 00:18:11.470 "compare_and_write": false, 00:18:11.470 "abort": true, 00:18:11.470 "seek_hole": false, 00:18:11.470 "seek_data": false, 00:18:11.470 "copy": true, 00:18:11.470 "nvme_iov_md": false 00:18:11.470 }, 00:18:11.470 "memory_domains": [ 00:18:11.470 { 00:18:11.470 "dma_device_id": "system", 00:18:11.470 "dma_device_type": 1 00:18:11.470 }, 00:18:11.470 { 00:18:11.470 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:11.470 "dma_device_type": 2 00:18:11.470 } 00:18:11.470 ], 00:18:11.470 "driver_specific": {} 00:18:11.470 } 00:18:11.470 ] 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:11.470 BaseBdev3 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:11.470 15:54:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:11.730 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:11.990 [ 00:18:11.990 { 00:18:11.990 "name": "BaseBdev3", 00:18:11.990 "aliases": [ 00:18:11.990 
"7958be4f-9b22-455c-8b13-01dafa7d6d5e" 00:18:11.990 ], 00:18:11.990 "product_name": "Malloc disk", 00:18:11.990 "block_size": 512, 00:18:11.990 "num_blocks": 65536, 00:18:11.990 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:11.990 "assigned_rate_limits": { 00:18:11.990 "rw_ios_per_sec": 0, 00:18:11.990 "rw_mbytes_per_sec": 0, 00:18:11.990 "r_mbytes_per_sec": 0, 00:18:11.990 "w_mbytes_per_sec": 0 00:18:11.990 }, 00:18:11.990 "claimed": false, 00:18:11.990 "zoned": false, 00:18:11.990 "supported_io_types": { 00:18:11.990 "read": true, 00:18:11.990 "write": true, 00:18:11.990 "unmap": true, 00:18:11.990 "flush": true, 00:18:11.990 "reset": true, 00:18:11.990 "nvme_admin": false, 00:18:11.990 "nvme_io": false, 00:18:11.990 "nvme_io_md": false, 00:18:11.990 "write_zeroes": true, 00:18:11.990 "zcopy": true, 00:18:11.990 "get_zone_info": false, 00:18:11.990 "zone_management": false, 00:18:11.990 "zone_append": false, 00:18:11.990 "compare": false, 00:18:11.990 "compare_and_write": false, 00:18:11.990 "abort": true, 00:18:11.990 "seek_hole": false, 00:18:11.990 "seek_data": false, 00:18:11.990 "copy": true, 00:18:11.990 "nvme_iov_md": false 00:18:11.990 }, 00:18:11.990 "memory_domains": [ 00:18:11.990 { 00:18:11.990 "dma_device_id": "system", 00:18:11.990 "dma_device_type": 1 00:18:11.990 }, 00:18:11.990 { 00:18:11.990 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:11.990 "dma_device_type": 2 00:18:11.990 } 00:18:11.990 ], 00:18:11.990 "driver_specific": {} 00:18:11.990 } 00:18:11.990 ] 00:18:11.990 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:11.990 15:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:11.990 15:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:11.990 15:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:12.249 BaseBdev4 00:18:12.249 15:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:12.249 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:12.249 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:12.249 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:12.249 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:12.249 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:12.249 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:12.249 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:12.507 [ 00:18:12.507 { 00:18:12.507 "name": "BaseBdev4", 00:18:12.507 "aliases": [ 00:18:12.508 "8e1df85e-50c5-4547-aa34-6431314fb84f" 00:18:12.508 ], 00:18:12.508 "product_name": "Malloc disk", 00:18:12.508 "block_size": 512, 00:18:12.508 "num_blocks": 65536, 00:18:12.508 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:12.508 "assigned_rate_limits": { 00:18:12.508 
"rw_ios_per_sec": 0, 00:18:12.508 "rw_mbytes_per_sec": 0, 00:18:12.508 "r_mbytes_per_sec": 0, 00:18:12.508 "w_mbytes_per_sec": 0 00:18:12.508 }, 00:18:12.508 "claimed": false, 00:18:12.508 "zoned": false, 00:18:12.508 "supported_io_types": { 00:18:12.508 "read": true, 00:18:12.508 "write": true, 00:18:12.508 "unmap": true, 00:18:12.508 "flush": true, 00:18:12.508 "reset": true, 00:18:12.508 "nvme_admin": false, 00:18:12.508 "nvme_io": false, 00:18:12.508 "nvme_io_md": false, 00:18:12.508 "write_zeroes": true, 00:18:12.508 "zcopy": true, 00:18:12.508 "get_zone_info": false, 00:18:12.508 "zone_management": false, 00:18:12.508 "zone_append": false, 00:18:12.508 "compare": false, 00:18:12.508 "compare_and_write": false, 00:18:12.508 "abort": true, 00:18:12.508 "seek_hole": false, 00:18:12.508 "seek_data": false, 00:18:12.508 "copy": true, 00:18:12.508 "nvme_iov_md": false 00:18:12.508 }, 00:18:12.508 "memory_domains": [ 00:18:12.508 { 00:18:12.508 "dma_device_id": "system", 00:18:12.508 "dma_device_type": 1 00:18:12.508 }, 00:18:12.508 { 00:18:12.508 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.508 "dma_device_type": 2 00:18:12.508 } 00:18:12.508 ], 00:18:12.508 "driver_specific": {} 00:18:12.508 } 00:18:12.508 ] 00:18:12.508 15:54:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:12.508 15:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:12.508 15:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:12.508 15:54:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:12.766 [2024-07-12 15:54:33.017286] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:12.766 [2024-07-12 15:54:33.017315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:12.766 [2024-07-12 15:54:33.017328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:12.766 [2024-07-12 15:54:33.018344] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:12.766 [2024-07-12 15:54:33.018374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.766 15:54:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.766 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.040 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.040 "name": "Existed_Raid", 00:18:13.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.040 "strip_size_kb": 64, 00:18:13.040 "state": "configuring", 00:18:13.040 "raid_level": "raid0", 00:18:13.040 "superblock": false, 00:18:13.040 "num_base_bdevs": 4, 00:18:13.040 "num_base_bdevs_discovered": 3, 00:18:13.040 "num_base_bdevs_operational": 4, 00:18:13.040 "base_bdevs_list": [ 00:18:13.040 { 00:18:13.040 "name": "BaseBdev1", 00:18:13.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.040 "is_configured": false, 00:18:13.040 "data_offset": 0, 00:18:13.040 "data_size": 0 00:18:13.040 }, 00:18:13.040 { 00:18:13.040 "name": "BaseBdev2", 00:18:13.040 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:13.040 "is_configured": true, 00:18:13.040 "data_offset": 0, 00:18:13.040 "data_size": 65536 00:18:13.040 }, 00:18:13.040 { 00:18:13.040 "name": "BaseBdev3", 00:18:13.040 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:13.040 "is_configured": true, 00:18:13.040 "data_offset": 0, 00:18:13.040 "data_size": 65536 00:18:13.040 }, 00:18:13.040 { 00:18:13.040 "name": "BaseBdev4", 00:18:13.040 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:13.040 "is_configured": true, 00:18:13.040 "data_offset": 0, 00:18:13.040 "data_size": 65536 00:18:13.040 } 00:18:13.040 ] 00:18:13.040 }' 00:18:13.040 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.040 15:54:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:13.608 [2024-07-12 15:54:33.935574] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:13.608 15:54:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.608 15:54:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.867 15:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.867 "name": "Existed_Raid", 00:18:13.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.867 "strip_size_kb": 64, 00:18:13.867 "state": "configuring", 00:18:13.867 "raid_level": "raid0", 00:18:13.867 "superblock": false, 00:18:13.867 "num_base_bdevs": 4, 00:18:13.867 "num_base_bdevs_discovered": 2, 00:18:13.867 "num_base_bdevs_operational": 4, 00:18:13.867 "base_bdevs_list": [ 00:18:13.867 { 00:18:13.867 "name": "BaseBdev1", 00:18:13.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:13.867 "is_configured": false, 00:18:13.867 "data_offset": 0, 00:18:13.867 "data_size": 0 00:18:13.867 }, 00:18:13.867 { 00:18:13.867 "name": null, 00:18:13.867 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:13.867 "is_configured": false, 00:18:13.867 "data_offset": 0, 00:18:13.867 "data_size": 65536 00:18:13.867 }, 00:18:13.867 { 00:18:13.867 "name": "BaseBdev3", 00:18:13.867 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:13.867 "is_configured": true, 00:18:13.867 "data_offset": 0, 00:18:13.867 "data_size": 65536 00:18:13.867 }, 00:18:13.867 { 00:18:13.867 "name": "BaseBdev4", 00:18:13.867 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:13.867 "is_configured": true, 00:18:13.867 "data_offset": 0, 00:18:13.867 "data_size": 65536 00:18:13.867 } 00:18:13.867 ] 00:18:13.867 }' 00:18:13.867 15:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.867 15:54:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:14.435 15:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.435 15:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:14.435 15:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:14.435 15:54:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:14.694 [2024-07-12 15:54:35.031246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:14.694 BaseBdev1 00:18:14.694 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:14.694 15:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:14.694 15:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:14.694 15:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:14.694 15:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:14.694 15:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:14.694 15:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:14.954 15:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:15.214 [ 00:18:15.214 { 00:18:15.214 "name": "BaseBdev1", 00:18:15.214 "aliases": [ 00:18:15.214 "e462b9b9-1dcf-44d1-9355-608cec31ec0d" 00:18:15.214 ], 00:18:15.214 "product_name": "Malloc disk", 00:18:15.214 "block_size": 512, 00:18:15.214 "num_blocks": 65536, 00:18:15.214 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:15.214 "assigned_rate_limits": { 00:18:15.214 "rw_ios_per_sec": 0, 00:18:15.214 "rw_mbytes_per_sec": 0, 00:18:15.214 "r_mbytes_per_sec": 0, 00:18:15.214 "w_mbytes_per_sec": 0 00:18:15.214 }, 00:18:15.214 "claimed": true, 00:18:15.214 "claim_type": "exclusive_write", 00:18:15.214 "zoned": false, 00:18:15.214 "supported_io_types": { 00:18:15.214 "read": true, 00:18:15.214 "write": true, 00:18:15.214 "unmap": true, 00:18:15.214 "flush": true, 00:18:15.214 "reset": true, 00:18:15.214 "nvme_admin": false, 00:18:15.214 "nvme_io": false, 00:18:15.214 "nvme_io_md": false, 00:18:15.214 "write_zeroes": true, 00:18:15.214 "zcopy": true, 00:18:15.214 "get_zone_info": false, 00:18:15.214 "zone_management": false, 00:18:15.214 "zone_append": false, 00:18:15.214 "compare": false, 00:18:15.214 "compare_and_write": false, 00:18:15.214 "abort": true, 00:18:15.214 "seek_hole": false, 00:18:15.214 "seek_data": false, 00:18:15.214 "copy": true, 00:18:15.214 "nvme_iov_md": false 00:18:15.214 }, 00:18:15.214 "memory_domains": [ 00:18:15.214 { 00:18:15.214 "dma_device_id": "system", 00:18:15.214 "dma_device_type": 1 00:18:15.214 }, 00:18:15.214 { 00:18:15.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.214 "dma_device_type": 2 00:18:15.214 } 00:18:15.214 ], 00:18:15.214 "driver_specific": {} 00:18:15.214 } 00:18:15.214 ] 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:15.214 "name": "Existed_Raid", 00:18:15.214 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:15.214 "strip_size_kb": 64, 00:18:15.214 "state": "configuring", 00:18:15.214 "raid_level": "raid0", 00:18:15.214 "superblock": false, 00:18:15.214 "num_base_bdevs": 4, 00:18:15.214 "num_base_bdevs_discovered": 3, 00:18:15.214 "num_base_bdevs_operational": 4, 00:18:15.214 "base_bdevs_list": [ 00:18:15.214 { 00:18:15.214 "name": "BaseBdev1", 00:18:15.214 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:15.214 "is_configured": true, 00:18:15.214 "data_offset": 0, 00:18:15.214 "data_size": 65536 00:18:15.214 }, 00:18:15.214 { 00:18:15.214 "name": null, 00:18:15.214 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:15.214 "is_configured": false, 00:18:15.214 "data_offset": 0, 00:18:15.214 "data_size": 65536 00:18:15.214 }, 00:18:15.214 { 00:18:15.214 "name": "BaseBdev3", 00:18:15.214 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:15.214 "is_configured": true, 00:18:15.214 "data_offset": 0, 00:18:15.214 "data_size": 65536 00:18:15.214 }, 00:18:15.214 { 00:18:15.214 "name": "BaseBdev4", 00:18:15.214 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:15.214 "is_configured": true, 00:18:15.214 "data_offset": 0, 00:18:15.214 "data_size": 65536 00:18:15.214 } 00:18:15.214 ] 00:18:15.214 }' 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:15.214 15:54:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:15.784 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.784 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:16.044 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:16.044 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:16.304 [2024-07-12 15:54:36.539084] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:16.304 "name": "Existed_Raid", 00:18:16.304 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:16.304 "strip_size_kb": 64, 00:18:16.304 "state": "configuring", 00:18:16.304 "raid_level": "raid0", 00:18:16.304 "superblock": false, 00:18:16.304 "num_base_bdevs": 4, 00:18:16.304 "num_base_bdevs_discovered": 2, 00:18:16.304 "num_base_bdevs_operational": 4, 00:18:16.304 "base_bdevs_list": [ 00:18:16.304 { 00:18:16.304 "name": "BaseBdev1", 00:18:16.304 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:16.304 "is_configured": true, 00:18:16.304 "data_offset": 0, 00:18:16.304 "data_size": 65536 00:18:16.304 }, 00:18:16.304 { 00:18:16.304 "name": null, 00:18:16.304 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:16.304 "is_configured": false, 00:18:16.304 "data_offset": 0, 00:18:16.304 "data_size": 65536 00:18:16.304 }, 00:18:16.304 { 00:18:16.304 "name": null, 00:18:16.304 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:16.304 "is_configured": false, 00:18:16.304 "data_offset": 0, 00:18:16.304 "data_size": 65536 00:18:16.304 }, 00:18:16.304 { 00:18:16.304 "name": "BaseBdev4", 00:18:16.304 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:16.304 "is_configured": true, 00:18:16.304 "data_offset": 0, 00:18:16.304 "data_size": 65536 00:18:16.304 } 00:18:16.304 ] 00:18:16.304 }' 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:16.304 15:54:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:16.873 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.873 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:17.133 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:17.133 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:17.393 [2024-07-12 15:54:37.673961] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:17.393 15:54:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.393 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:17.653 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:17.653 "name": "Existed_Raid", 00:18:17.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:17.653 "strip_size_kb": 64, 00:18:17.653 "state": "configuring", 00:18:17.653 "raid_level": "raid0", 00:18:17.653 "superblock": false, 00:18:17.653 "num_base_bdevs": 4, 00:18:17.653 "num_base_bdevs_discovered": 3, 00:18:17.653 "num_base_bdevs_operational": 4, 00:18:17.653 "base_bdevs_list": [ 00:18:17.653 { 00:18:17.653 "name": "BaseBdev1", 00:18:17.653 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:17.653 "is_configured": true, 00:18:17.653 "data_offset": 0, 00:18:17.653 "data_size": 65536 00:18:17.653 }, 00:18:17.653 { 00:18:17.653 "name": null, 00:18:17.653 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:17.653 "is_configured": false, 00:18:17.653 "data_offset": 0, 00:18:17.653 "data_size": 65536 00:18:17.653 }, 00:18:17.653 { 00:18:17.653 "name": "BaseBdev3", 00:18:17.653 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:17.653 "is_configured": true, 00:18:17.653 "data_offset": 0, 00:18:17.653 "data_size": 65536 00:18:17.653 }, 00:18:17.653 { 00:18:17.653 "name": "BaseBdev4", 00:18:17.653 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:17.653 "is_configured": true, 00:18:17.653 "data_offset": 0, 00:18:17.653 "data_size": 65536 00:18:17.653 } 00:18:17.653 ] 00:18:17.653 }' 00:18:17.653 15:54:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:17.653 15:54:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.222 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.222 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:18.222 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:18.222 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:18.482 [2024-07-12 15:54:38.800832] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:18.482 15:54:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.482 15:54:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:18.742 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:18.742 "name": "Existed_Raid", 00:18:18.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:18.742 "strip_size_kb": 64, 00:18:18.742 "state": "configuring", 00:18:18.742 "raid_level": "raid0", 00:18:18.742 "superblock": false, 00:18:18.742 "num_base_bdevs": 4, 00:18:18.742 "num_base_bdevs_discovered": 2, 00:18:18.742 "num_base_bdevs_operational": 4, 00:18:18.742 "base_bdevs_list": [ 00:18:18.742 { 00:18:18.742 "name": null, 00:18:18.742 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:18.742 "is_configured": false, 00:18:18.742 "data_offset": 0, 00:18:18.742 "data_size": 65536 00:18:18.742 }, 00:18:18.742 { 00:18:18.742 "name": null, 00:18:18.742 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:18.742 "is_configured": false, 00:18:18.742 "data_offset": 0, 00:18:18.742 "data_size": 65536 00:18:18.742 }, 00:18:18.742 { 00:18:18.742 "name": "BaseBdev3", 00:18:18.742 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:18.742 "is_configured": true, 00:18:18.742 "data_offset": 0, 00:18:18.742 "data_size": 65536 00:18:18.742 }, 00:18:18.742 { 00:18:18.742 "name": "BaseBdev4", 00:18:18.742 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:18.742 "is_configured": true, 00:18:18.742 "data_offset": 0, 00:18:18.742 "data_size": 65536 00:18:18.742 } 00:18:18.742 ] 00:18:18.742 }' 00:18:18.742 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:18.742 15:54:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:19.312 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.312 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:19.312 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:19.312 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:19.572 [2024-07-12 15:54:39.933491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:19.572 15:54:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.572 15:54:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:19.831 15:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:19.831 "name": "Existed_Raid", 00:18:19.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:19.831 "strip_size_kb": 64, 00:18:19.831 "state": "configuring", 00:18:19.831 "raid_level": "raid0", 00:18:19.831 "superblock": false, 00:18:19.831 "num_base_bdevs": 4, 00:18:19.831 "num_base_bdevs_discovered": 3, 00:18:19.831 "num_base_bdevs_operational": 4, 00:18:19.831 "base_bdevs_list": [ 00:18:19.831 { 00:18:19.831 "name": null, 00:18:19.831 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:19.831 "is_configured": false, 00:18:19.831 "data_offset": 0, 00:18:19.831 "data_size": 65536 00:18:19.831 }, 00:18:19.831 { 00:18:19.831 "name": "BaseBdev2", 00:18:19.831 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:19.831 "is_configured": true, 00:18:19.831 "data_offset": 0, 00:18:19.831 "data_size": 65536 00:18:19.831 }, 00:18:19.831 { 00:18:19.831 "name": "BaseBdev3", 00:18:19.831 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:19.831 "is_configured": true, 00:18:19.831 "data_offset": 0, 00:18:19.831 "data_size": 65536 00:18:19.831 }, 00:18:19.831 { 00:18:19.831 "name": "BaseBdev4", 00:18:19.831 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:19.831 "is_configured": true, 00:18:19.831 "data_offset": 0, 00:18:19.831 "data_size": 65536 00:18:19.831 } 00:18:19.831 ] 00:18:19.831 }' 00:18:19.831 15:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:19.831 15:54:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.401 15:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.401 15:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:20.661 15:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:20.661 
15:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.661 15:54:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:20.661 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e462b9b9-1dcf-44d1-9355-608cec31ec0d 00:18:20.921 [2024-07-12 15:54:41.249836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:20.921 [2024-07-12 15:54:41.249862] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13e38c0 00:18:20.921 [2024-07-12 15:54:41.249866] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:20.921 [2024-07-12 15:54:41.250020] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e84c0 00:18:20.921 [2024-07-12 15:54:41.250110] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13e38c0 00:18:20.921 [2024-07-12 15:54:41.250115] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x13e38c0 00:18:20.921 [2024-07-12 15:54:41.250234] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:20.921 NewBaseBdev 00:18:20.921 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:20.921 15:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:20.921 15:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:20.921 15:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:20.921 15:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:20.921 15:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:20.921 15:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:21.180 15:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:21.440 [ 00:18:21.441 { 00:18:21.441 "name": "NewBaseBdev", 00:18:21.441 "aliases": [ 00:18:21.441 "e462b9b9-1dcf-44d1-9355-608cec31ec0d" 00:18:21.441 ], 00:18:21.441 "product_name": "Malloc disk", 00:18:21.441 "block_size": 512, 00:18:21.441 "num_blocks": 65536, 00:18:21.441 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:21.441 "assigned_rate_limits": { 00:18:21.441 "rw_ios_per_sec": 0, 00:18:21.441 "rw_mbytes_per_sec": 0, 00:18:21.441 "r_mbytes_per_sec": 0, 00:18:21.441 "w_mbytes_per_sec": 0 00:18:21.441 }, 00:18:21.441 "claimed": true, 00:18:21.441 "claim_type": "exclusive_write", 00:18:21.441 "zoned": false, 00:18:21.441 "supported_io_types": { 00:18:21.441 "read": true, 00:18:21.441 "write": true, 00:18:21.441 "unmap": true, 00:18:21.441 "flush": true, 00:18:21.441 "reset": true, 00:18:21.441 "nvme_admin": false, 00:18:21.441 "nvme_io": false, 00:18:21.441 "nvme_io_md": false, 00:18:21.441 "write_zeroes": true, 00:18:21.441 "zcopy": true, 
00:18:21.441 "get_zone_info": false, 00:18:21.441 "zone_management": false, 00:18:21.441 "zone_append": false, 00:18:21.441 "compare": false, 00:18:21.441 "compare_and_write": false, 00:18:21.441 "abort": true, 00:18:21.441 "seek_hole": false, 00:18:21.441 "seek_data": false, 00:18:21.441 "copy": true, 00:18:21.441 "nvme_iov_md": false 00:18:21.441 }, 00:18:21.441 "memory_domains": [ 00:18:21.441 { 00:18:21.441 "dma_device_id": "system", 00:18:21.441 "dma_device_type": 1 00:18:21.441 }, 00:18:21.441 { 00:18:21.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.441 "dma_device_type": 2 00:18:21.441 } 00:18:21.441 ], 00:18:21.441 "driver_specific": {} 00:18:21.441 } 00:18:21.441 ] 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:21.441 "name": "Existed_Raid", 00:18:21.441 "uuid": "655f84cd-c0b9-40c9-80b2-95b047cad8a8", 00:18:21.441 "strip_size_kb": 64, 00:18:21.441 "state": "online", 00:18:21.441 "raid_level": "raid0", 00:18:21.441 "superblock": false, 00:18:21.441 "num_base_bdevs": 4, 00:18:21.441 "num_base_bdevs_discovered": 4, 00:18:21.441 "num_base_bdevs_operational": 4, 00:18:21.441 "base_bdevs_list": [ 00:18:21.441 { 00:18:21.441 "name": "NewBaseBdev", 00:18:21.441 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:21.441 "is_configured": true, 00:18:21.441 "data_offset": 0, 00:18:21.441 "data_size": 65536 00:18:21.441 }, 00:18:21.441 { 00:18:21.441 "name": "BaseBdev2", 00:18:21.441 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:21.441 "is_configured": true, 00:18:21.441 "data_offset": 0, 00:18:21.441 "data_size": 65536 00:18:21.441 }, 00:18:21.441 { 00:18:21.441 "name": "BaseBdev3", 00:18:21.441 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:21.441 "is_configured": true, 00:18:21.441 "data_offset": 0, 00:18:21.441 "data_size": 65536 00:18:21.441 }, 00:18:21.441 { 00:18:21.441 "name": "BaseBdev4", 00:18:21.441 "uuid": 
"8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:21.441 "is_configured": true, 00:18:21.441 "data_offset": 0, 00:18:21.441 "data_size": 65536 00:18:21.441 } 00:18:21.441 ] 00:18:21.441 }' 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:21.441 15:54:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:22.016 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:22.016 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:22.016 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:22.016 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:22.016 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:22.016 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:22.016 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:22.016 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:22.318 [2024-07-12 15:54:42.573440] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:22.318 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:22.318 "name": "Existed_Raid", 00:18:22.318 "aliases": [ 00:18:22.318 "655f84cd-c0b9-40c9-80b2-95b047cad8a8" 00:18:22.318 ], 00:18:22.318 "product_name": "Raid Volume", 00:18:22.318 "block_size": 512, 00:18:22.318 "num_blocks": 262144, 00:18:22.318 "uuid": "655f84cd-c0b9-40c9-80b2-95b047cad8a8", 00:18:22.318 "assigned_rate_limits": { 00:18:22.318 "rw_ios_per_sec": 0, 00:18:22.318 "rw_mbytes_per_sec": 0, 00:18:22.318 "r_mbytes_per_sec": 0, 00:18:22.318 "w_mbytes_per_sec": 0 00:18:22.318 }, 00:18:22.318 "claimed": false, 00:18:22.318 "zoned": false, 00:18:22.318 "supported_io_types": { 00:18:22.318 "read": true, 00:18:22.318 "write": true, 00:18:22.318 "unmap": true, 00:18:22.318 "flush": true, 00:18:22.318 "reset": true, 00:18:22.318 "nvme_admin": false, 00:18:22.318 "nvme_io": false, 00:18:22.318 "nvme_io_md": false, 00:18:22.318 "write_zeroes": true, 00:18:22.318 "zcopy": false, 00:18:22.318 "get_zone_info": false, 00:18:22.318 "zone_management": false, 00:18:22.318 "zone_append": false, 00:18:22.318 "compare": false, 00:18:22.318 "compare_and_write": false, 00:18:22.318 "abort": false, 00:18:22.318 "seek_hole": false, 00:18:22.318 "seek_data": false, 00:18:22.318 "copy": false, 00:18:22.318 "nvme_iov_md": false 00:18:22.318 }, 00:18:22.318 "memory_domains": [ 00:18:22.318 { 00:18:22.318 "dma_device_id": "system", 00:18:22.318 "dma_device_type": 1 00:18:22.318 }, 00:18:22.318 { 00:18:22.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.318 "dma_device_type": 2 00:18:22.318 }, 00:18:22.318 { 00:18:22.318 "dma_device_id": "system", 00:18:22.318 "dma_device_type": 1 00:18:22.318 }, 00:18:22.318 { 00:18:22.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.318 "dma_device_type": 2 00:18:22.318 }, 00:18:22.318 { 00:18:22.318 "dma_device_id": "system", 00:18:22.318 "dma_device_type": 1 00:18:22.318 }, 00:18:22.318 { 00:18:22.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.318 "dma_device_type": 2 00:18:22.318 }, 
00:18:22.318 { 00:18:22.318 "dma_device_id": "system", 00:18:22.318 "dma_device_type": 1 00:18:22.318 }, 00:18:22.318 { 00:18:22.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.318 "dma_device_type": 2 00:18:22.318 } 00:18:22.318 ], 00:18:22.318 "driver_specific": { 00:18:22.318 "raid": { 00:18:22.318 "uuid": "655f84cd-c0b9-40c9-80b2-95b047cad8a8", 00:18:22.318 "strip_size_kb": 64, 00:18:22.318 "state": "online", 00:18:22.318 "raid_level": "raid0", 00:18:22.318 "superblock": false, 00:18:22.318 "num_base_bdevs": 4, 00:18:22.318 "num_base_bdevs_discovered": 4, 00:18:22.318 "num_base_bdevs_operational": 4, 00:18:22.318 "base_bdevs_list": [ 00:18:22.318 { 00:18:22.318 "name": "NewBaseBdev", 00:18:22.318 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:22.318 "is_configured": true, 00:18:22.318 "data_offset": 0, 00:18:22.318 "data_size": 65536 00:18:22.318 }, 00:18:22.318 { 00:18:22.318 "name": "BaseBdev2", 00:18:22.318 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:22.318 "is_configured": true, 00:18:22.318 "data_offset": 0, 00:18:22.318 "data_size": 65536 00:18:22.318 }, 00:18:22.318 { 00:18:22.318 "name": "BaseBdev3", 00:18:22.318 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:22.318 "is_configured": true, 00:18:22.318 "data_offset": 0, 00:18:22.318 "data_size": 65536 00:18:22.318 }, 00:18:22.318 { 00:18:22.318 "name": "BaseBdev4", 00:18:22.318 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:22.318 "is_configured": true, 00:18:22.318 "data_offset": 0, 00:18:22.318 "data_size": 65536 00:18:22.318 } 00:18:22.318 ] 00:18:22.318 } 00:18:22.318 } 00:18:22.318 }' 00:18:22.318 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:22.318 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:22.318 BaseBdev2 00:18:22.318 BaseBdev3 00:18:22.318 BaseBdev4' 00:18:22.318 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:22.318 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:22.318 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:22.578 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:22.578 "name": "NewBaseBdev", 00:18:22.578 "aliases": [ 00:18:22.578 "e462b9b9-1dcf-44d1-9355-608cec31ec0d" 00:18:22.578 ], 00:18:22.578 "product_name": "Malloc disk", 00:18:22.578 "block_size": 512, 00:18:22.578 "num_blocks": 65536, 00:18:22.578 "uuid": "e462b9b9-1dcf-44d1-9355-608cec31ec0d", 00:18:22.578 "assigned_rate_limits": { 00:18:22.578 "rw_ios_per_sec": 0, 00:18:22.578 "rw_mbytes_per_sec": 0, 00:18:22.578 "r_mbytes_per_sec": 0, 00:18:22.578 "w_mbytes_per_sec": 0 00:18:22.578 }, 00:18:22.578 "claimed": true, 00:18:22.578 "claim_type": "exclusive_write", 00:18:22.578 "zoned": false, 00:18:22.578 "supported_io_types": { 00:18:22.578 "read": true, 00:18:22.578 "write": true, 00:18:22.578 "unmap": true, 00:18:22.578 "flush": true, 00:18:22.578 "reset": true, 00:18:22.578 "nvme_admin": false, 00:18:22.578 "nvme_io": false, 00:18:22.578 "nvme_io_md": false, 00:18:22.578 "write_zeroes": true, 00:18:22.578 "zcopy": true, 00:18:22.578 "get_zone_info": false, 00:18:22.578 "zone_management": false, 00:18:22.578 "zone_append": false, 
00:18:22.578 "compare": false, 00:18:22.578 "compare_and_write": false, 00:18:22.578 "abort": true, 00:18:22.578 "seek_hole": false, 00:18:22.578 "seek_data": false, 00:18:22.578 "copy": true, 00:18:22.578 "nvme_iov_md": false 00:18:22.578 }, 00:18:22.578 "memory_domains": [ 00:18:22.578 { 00:18:22.578 "dma_device_id": "system", 00:18:22.578 "dma_device_type": 1 00:18:22.578 }, 00:18:22.578 { 00:18:22.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.578 "dma_device_type": 2 00:18:22.578 } 00:18:22.578 ], 00:18:22.579 "driver_specific": {} 00:18:22.579 }' 00:18:22.579 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.579 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.579 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:22.579 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.579 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.579 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:22.579 15:54:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.838 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.839 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:22.839 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.839 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.839 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:22.839 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:22.839 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:22.839 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:23.099 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:23.099 "name": "BaseBdev2", 00:18:23.099 "aliases": [ 00:18:23.099 "c8e4cc49-18ee-4228-a30b-a22ae241d6f8" 00:18:23.099 ], 00:18:23.099 "product_name": "Malloc disk", 00:18:23.099 "block_size": 512, 00:18:23.099 "num_blocks": 65536, 00:18:23.099 "uuid": "c8e4cc49-18ee-4228-a30b-a22ae241d6f8", 00:18:23.099 "assigned_rate_limits": { 00:18:23.099 "rw_ios_per_sec": 0, 00:18:23.099 "rw_mbytes_per_sec": 0, 00:18:23.099 "r_mbytes_per_sec": 0, 00:18:23.099 "w_mbytes_per_sec": 0 00:18:23.099 }, 00:18:23.099 "claimed": true, 00:18:23.099 "claim_type": "exclusive_write", 00:18:23.099 "zoned": false, 00:18:23.099 "supported_io_types": { 00:18:23.099 "read": true, 00:18:23.099 "write": true, 00:18:23.099 "unmap": true, 00:18:23.099 "flush": true, 00:18:23.099 "reset": true, 00:18:23.099 "nvme_admin": false, 00:18:23.099 "nvme_io": false, 00:18:23.099 "nvme_io_md": false, 00:18:23.099 "write_zeroes": true, 00:18:23.099 "zcopy": true, 00:18:23.099 "get_zone_info": false, 00:18:23.099 "zone_management": false, 00:18:23.099 "zone_append": false, 00:18:23.099 "compare": false, 00:18:23.099 "compare_and_write": false, 00:18:23.099 "abort": true, 00:18:23.099 "seek_hole": false, 00:18:23.099 "seek_data": false, 00:18:23.099 
"copy": true, 00:18:23.099 "nvme_iov_md": false 00:18:23.099 }, 00:18:23.100 "memory_domains": [ 00:18:23.100 { 00:18:23.100 "dma_device_id": "system", 00:18:23.100 "dma_device_type": 1 00:18:23.100 }, 00:18:23.100 { 00:18:23.100 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.100 "dma_device_type": 2 00:18:23.100 } 00:18:23.100 ], 00:18:23.100 "driver_specific": {} 00:18:23.100 }' 00:18:23.100 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.100 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.100 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:23.100 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.100 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.100 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:23.100 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.359 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.359 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:23.359 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.359 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.359 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:23.359 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:23.359 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:23.359 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:23.620 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:23.620 "name": "BaseBdev3", 00:18:23.620 "aliases": [ 00:18:23.620 "7958be4f-9b22-455c-8b13-01dafa7d6d5e" 00:18:23.620 ], 00:18:23.620 "product_name": "Malloc disk", 00:18:23.620 "block_size": 512, 00:18:23.620 "num_blocks": 65536, 00:18:23.620 "uuid": "7958be4f-9b22-455c-8b13-01dafa7d6d5e", 00:18:23.620 "assigned_rate_limits": { 00:18:23.620 "rw_ios_per_sec": 0, 00:18:23.620 "rw_mbytes_per_sec": 0, 00:18:23.620 "r_mbytes_per_sec": 0, 00:18:23.620 "w_mbytes_per_sec": 0 00:18:23.620 }, 00:18:23.620 "claimed": true, 00:18:23.620 "claim_type": "exclusive_write", 00:18:23.620 "zoned": false, 00:18:23.620 "supported_io_types": { 00:18:23.620 "read": true, 00:18:23.620 "write": true, 00:18:23.620 "unmap": true, 00:18:23.620 "flush": true, 00:18:23.620 "reset": true, 00:18:23.620 "nvme_admin": false, 00:18:23.620 "nvme_io": false, 00:18:23.620 "nvme_io_md": false, 00:18:23.620 "write_zeroes": true, 00:18:23.620 "zcopy": true, 00:18:23.620 "get_zone_info": false, 00:18:23.620 "zone_management": false, 00:18:23.620 "zone_append": false, 00:18:23.620 "compare": false, 00:18:23.620 "compare_and_write": false, 00:18:23.620 "abort": true, 00:18:23.620 "seek_hole": false, 00:18:23.620 "seek_data": false, 00:18:23.620 "copy": true, 00:18:23.620 "nvme_iov_md": false 00:18:23.620 }, 00:18:23.620 "memory_domains": [ 00:18:23.620 { 00:18:23.620 "dma_device_id": "system", 00:18:23.620 
"dma_device_type": 1 00:18:23.620 }, 00:18:23.620 { 00:18:23.620 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:23.620 "dma_device_type": 2 00:18:23.620 } 00:18:23.620 ], 00:18:23.620 "driver_specific": {} 00:18:23.620 }' 00:18:23.620 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.620 15:54:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:23.620 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:23.620 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.620 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:23.880 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:24.450 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:24.450 "name": "BaseBdev4", 00:18:24.450 "aliases": [ 00:18:24.450 "8e1df85e-50c5-4547-aa34-6431314fb84f" 00:18:24.450 ], 00:18:24.450 "product_name": "Malloc disk", 00:18:24.450 "block_size": 512, 00:18:24.450 "num_blocks": 65536, 00:18:24.450 "uuid": "8e1df85e-50c5-4547-aa34-6431314fb84f", 00:18:24.450 "assigned_rate_limits": { 00:18:24.450 "rw_ios_per_sec": 0, 00:18:24.450 "rw_mbytes_per_sec": 0, 00:18:24.450 "r_mbytes_per_sec": 0, 00:18:24.450 "w_mbytes_per_sec": 0 00:18:24.450 }, 00:18:24.450 "claimed": true, 00:18:24.450 "claim_type": "exclusive_write", 00:18:24.450 "zoned": false, 00:18:24.450 "supported_io_types": { 00:18:24.450 "read": true, 00:18:24.450 "write": true, 00:18:24.450 "unmap": true, 00:18:24.450 "flush": true, 00:18:24.450 "reset": true, 00:18:24.450 "nvme_admin": false, 00:18:24.450 "nvme_io": false, 00:18:24.450 "nvme_io_md": false, 00:18:24.450 "write_zeroes": true, 00:18:24.450 "zcopy": true, 00:18:24.450 "get_zone_info": false, 00:18:24.450 "zone_management": false, 00:18:24.450 "zone_append": false, 00:18:24.450 "compare": false, 00:18:24.450 "compare_and_write": false, 00:18:24.450 "abort": true, 00:18:24.450 "seek_hole": false, 00:18:24.450 "seek_data": false, 00:18:24.450 "copy": true, 00:18:24.450 "nvme_iov_md": false 00:18:24.450 }, 00:18:24.450 "memory_domains": [ 00:18:24.450 { 00:18:24.450 "dma_device_id": "system", 00:18:24.450 "dma_device_type": 1 00:18:24.450 }, 00:18:24.450 { 00:18:24.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.450 "dma_device_type": 2 00:18:24.450 } 00:18:24.450 ], 
00:18:24.450 "driver_specific": {} 00:18:24.450 }' 00:18:24.450 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.450 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:24.450 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:24.450 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.709 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:24.709 15:54:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:24.710 15:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.710 15:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:24.710 15:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:24.710 15:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.969 15:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:24.969 15:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:24.969 15:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:24.969 [2024-07-12 15:54:45.356256] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:24.969 [2024-07-12 15:54:45.356275] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:24.969 [2024-07-12 15:54:45.356314] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:24.969 [2024-07-12 15:54:45.356359] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:24.969 [2024-07-12 15:54:45.356365] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13e38c0 name Existed_Raid, state offline 00:18:24.969 15:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2568087 00:18:24.969 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2568087 ']' 00:18:24.969 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2568087 00:18:24.969 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:24.969 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:24.969 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2568087 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2568087' 00:18:25.230 killing process with pid 2568087 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2568087 00:18:25.230 [2024-07-12 15:54:45.427118] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@972 -- # wait 2568087 00:18:25.230 [2024-07-12 15:54:45.447380] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:25.230 00:18:25.230 real 0m27.680s 00:18:25.230 user 0m51.984s 00:18:25.230 sys 0m4.028s 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.230 ************************************ 00:18:25.230 END TEST raid_state_function_test 00:18:25.230 ************************************ 00:18:25.230 15:54:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:25.230 15:54:45 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:18:25.230 15:54:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:25.230 15:54:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:25.230 15:54:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:25.230 ************************************ 00:18:25.230 START TEST raid_state_function_test_sb 00:18:25.230 ************************************ 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 
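Each variant of this test ends the same way: bdev_raid.sh@203-208 walk the configured base bdevs and compare block_size, md_size, md_interleave and dif_type from bdev_get_bdevs output, then bdev_raid.sh@338 deletes the array. A minimal sketch of that verification loop, assuming the same RPC socket, raid name and 512-byte malloc base bdevs used in this run:

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # raid bdev JSON, as captured into raid_bdev_info above
  raid_bdev_info=$($rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')
  # bdev_raid.sh@201: keep only base bdevs that are actually configured
  base_bdev_names=$(jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' <<< "$raid_bdev_info")
  for name in $base_bdev_names; do
      base_bdev_info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')   # @204
      [[ $(jq .block_size <<< "$base_bdev_info") == 512 ]]          # @205: malloc bdevs were created with 512-byte blocks
      [[ $(jq .md_size <<< "$base_bdev_info") == null ]]            # @206: no separate metadata
      [[ $(jq .md_interleave <<< "$base_bdev_info") == null ]]      # @207
      [[ $(jq .dif_type <<< "$base_bdev_info") == null ]]           # @208
  done
  $rpc bdev_raid_delete Existed_Raid                                # @338: drives the raid from online to offline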
00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:25.230 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2573352 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2573352' 00:18:25.231 Process raid pid: 2573352 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2573352 /var/tmp/spdk-raid.sock 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2573352 ']' 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:25.231 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:25.231 15:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:25.492 [2024-07-12 15:54:45.705209] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
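raid_state_function_test_sb reruns the same state machine with superblock=true, so bdev_raid_create is passed -s and the base bdevs carry a 2048-block data_offset instead of 0, as the later JSON shows. The prologue above starts a dedicated bdev_svc app as the RPC target; a rough sketch of that setup, assuming the workspace paths from this job and the waitforlisten/killprocess helpers from autotest_common.sh:

  rpc_sock=/var/tmp/spdk-raid.sock
  # bdev_raid.sh@243: bare app that only hosts bdevs, with bdev_raid debug logging enabled
  /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r "$rpc_sock" -i 0 -L bdev_raid &
  raid_pid=$!                               # @244
  waitforlisten "$raid_pid" "$rpc_sock"     # @246: block until the UNIX socket accepts RPCs
  # ... test body issues rpc.py calls against $rpc_sock ...
  killprocess "$raid_pid"                   # teardown, as the previous test did at bdev_raid.sh@341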
00:18:25.492 [2024-07-12 15:54:45.705252] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:25.492 [2024-07-12 15:54:45.790078] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:25.492 [2024-07-12 15:54:45.853804] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:25.492 [2024-07-12 15:54:45.903596] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:25.492 [2024-07-12 15:54:45.903619] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:26.432 [2024-07-12 15:54:46.703296] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:26.432 [2024-07-12 15:54:46.703330] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:26.432 [2024-07-12 15:54:46.703336] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:26.432 [2024-07-12 15:54:46.703342] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:26.432 [2024-07-12 15:54:46.703347] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:26.432 [2024-07-12 15:54:46.703352] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:26.432 [2024-07-12 15:54:46.703357] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:26.432 [2024-07-12 15:54:46.703362] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.432 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.691 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.691 "name": "Existed_Raid", 00:18:26.691 "uuid": "5955773e-1c88-4fce-90b9-e759969965ec", 00:18:26.691 "strip_size_kb": 64, 00:18:26.692 "state": "configuring", 00:18:26.692 "raid_level": "raid0", 00:18:26.692 "superblock": true, 00:18:26.692 "num_base_bdevs": 4, 00:18:26.692 "num_base_bdevs_discovered": 0, 00:18:26.692 "num_base_bdevs_operational": 4, 00:18:26.692 "base_bdevs_list": [ 00:18:26.692 { 00:18:26.692 "name": "BaseBdev1", 00:18:26.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.692 "is_configured": false, 00:18:26.692 "data_offset": 0, 00:18:26.692 "data_size": 0 00:18:26.692 }, 00:18:26.692 { 00:18:26.692 "name": "BaseBdev2", 00:18:26.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.692 "is_configured": false, 00:18:26.692 "data_offset": 0, 00:18:26.692 "data_size": 0 00:18:26.692 }, 00:18:26.692 { 00:18:26.692 "name": "BaseBdev3", 00:18:26.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.692 "is_configured": false, 00:18:26.692 "data_offset": 0, 00:18:26.692 "data_size": 0 00:18:26.692 }, 00:18:26.692 { 00:18:26.692 "name": "BaseBdev4", 00:18:26.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:26.692 "is_configured": false, 00:18:26.692 "data_offset": 0, 00:18:26.692 "data_size": 0 00:18:26.692 } 00:18:26.692 ] 00:18:26.692 }' 00:18:26.692 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:26.692 15:54:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.260 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:27.260 [2024-07-12 15:54:47.649565] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:27.260 [2024-07-12 15:54:47.649583] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f73920 name Existed_Raid, state configuring 00:18:27.260 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:27.520 [2024-07-12 15:54:47.834061] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:27.520 [2024-07-12 15:54:47.834080] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:27.520 [2024-07-12 15:54:47.834085] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:27.520 [2024-07-12 15:54:47.834090] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:27.520 [2024-07-12 15:54:47.834095] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:27.520 [2024-07-12 15:54:47.834100] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:27.520 [2024-07-12 15:54:47.834104] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 
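At this point bdev_raid_create has been issued while none of the four named base bdevs exist, so Existed_Raid sits in the "configuring" state with num_base_bdevs_discovered 0 and all-zero UUID placeholders in base_bdevs_list. A condensed sketch of the create-then-populate flow this test exercises (the script itself deletes and recreates the raid in between; socket, sizes and names as in this run):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  # Register the raid first: -r raid0, -z 64 KiB strip size, -s for an on-disk superblock.
  $rpc bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  # Each 32 MiB / 512 B malloc bdev is claimed as soon as it appears and bumps
  # num_base_bdevs_discovered; with the fourth one the raid state changes to "online".
  for i in 1 2 3 4; do
      $rpc bdev_malloc_create 32 512 -b "BaseBdev$i"
  done
  # bdev_raid.sh@126: dump the current raid state, as used by verify_raid_bdev_state
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'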
00:18:27.520 [2024-07-12 15:54:47.834110] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:27.520 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:27.781 [2024-07-12 15:54:48.024904] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:27.781 BaseBdev1 00:18:27.781 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:27.781 15:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:27.781 15:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:27.781 15:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:27.781 15:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:27.781 15:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:27.781 15:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:27.781 15:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:28.040 [ 00:18:28.040 { 00:18:28.040 "name": "BaseBdev1", 00:18:28.040 "aliases": [ 00:18:28.040 "5667a42f-a928-477d-b0da-5073f5d8f5e9" 00:18:28.040 ], 00:18:28.040 "product_name": "Malloc disk", 00:18:28.040 "block_size": 512, 00:18:28.040 "num_blocks": 65536, 00:18:28.040 "uuid": "5667a42f-a928-477d-b0da-5073f5d8f5e9", 00:18:28.040 "assigned_rate_limits": { 00:18:28.040 "rw_ios_per_sec": 0, 00:18:28.040 "rw_mbytes_per_sec": 0, 00:18:28.040 "r_mbytes_per_sec": 0, 00:18:28.040 "w_mbytes_per_sec": 0 00:18:28.040 }, 00:18:28.040 "claimed": true, 00:18:28.040 "claim_type": "exclusive_write", 00:18:28.040 "zoned": false, 00:18:28.041 "supported_io_types": { 00:18:28.041 "read": true, 00:18:28.041 "write": true, 00:18:28.041 "unmap": true, 00:18:28.041 "flush": true, 00:18:28.041 "reset": true, 00:18:28.041 "nvme_admin": false, 00:18:28.041 "nvme_io": false, 00:18:28.041 "nvme_io_md": false, 00:18:28.041 "write_zeroes": true, 00:18:28.041 "zcopy": true, 00:18:28.041 "get_zone_info": false, 00:18:28.041 "zone_management": false, 00:18:28.041 "zone_append": false, 00:18:28.041 "compare": false, 00:18:28.041 "compare_and_write": false, 00:18:28.041 "abort": true, 00:18:28.041 "seek_hole": false, 00:18:28.041 "seek_data": false, 00:18:28.041 "copy": true, 00:18:28.041 "nvme_iov_md": false 00:18:28.041 }, 00:18:28.041 "memory_domains": [ 00:18:28.041 { 00:18:28.041 "dma_device_id": "system", 00:18:28.041 "dma_device_type": 1 00:18:28.041 }, 00:18:28.041 { 00:18:28.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.041 "dma_device_type": 2 00:18:28.041 } 00:18:28.041 ], 00:18:28.041 "driver_specific": {} 00:18:28.041 } 00:18:28.041 ] 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:28.041 
15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:28.041 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:28.300 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.300 "name": "Existed_Raid", 00:18:28.300 "uuid": "c46397f0-5fa3-4ee1-99cd-eb1aafb9deda", 00:18:28.300 "strip_size_kb": 64, 00:18:28.300 "state": "configuring", 00:18:28.300 "raid_level": "raid0", 00:18:28.300 "superblock": true, 00:18:28.300 "num_base_bdevs": 4, 00:18:28.300 "num_base_bdevs_discovered": 1, 00:18:28.300 "num_base_bdevs_operational": 4, 00:18:28.300 "base_bdevs_list": [ 00:18:28.300 { 00:18:28.300 "name": "BaseBdev1", 00:18:28.300 "uuid": "5667a42f-a928-477d-b0da-5073f5d8f5e9", 00:18:28.300 "is_configured": true, 00:18:28.300 "data_offset": 2048, 00:18:28.300 "data_size": 63488 00:18:28.300 }, 00:18:28.300 { 00:18:28.300 "name": "BaseBdev2", 00:18:28.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:28.300 "is_configured": false, 00:18:28.300 "data_offset": 0, 00:18:28.300 "data_size": 0 00:18:28.300 }, 00:18:28.300 { 00:18:28.300 "name": "BaseBdev3", 00:18:28.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:28.300 "is_configured": false, 00:18:28.300 "data_offset": 0, 00:18:28.300 "data_size": 0 00:18:28.300 }, 00:18:28.300 { 00:18:28.300 "name": "BaseBdev4", 00:18:28.300 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:28.300 "is_configured": false, 00:18:28.300 "data_offset": 0, 00:18:28.300 "data_size": 0 00:18:28.300 } 00:18:28.300 ] 00:18:28.300 }' 00:18:28.300 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.300 15:54:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:28.870 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:29.130 [2024-07-12 15:54:49.320181] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:29.130 [2024-07-12 15:54:49.320212] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f73190 name Existed_Raid, state configuring 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:29.130 [2024-07-12 15:54:49.516718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:29.130 [2024-07-12 15:54:49.517823] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:29.130 [2024-07-12 15:54:49.517849] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:29.130 [2024-07-12 15:54:49.517855] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:29.130 [2024-07-12 15:54:49.517860] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:29.130 [2024-07-12 15:54:49.517865] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:29.130 [2024-07-12 15:54:49.517870] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.130 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.410 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.410 "name": "Existed_Raid", 00:18:29.410 "uuid": "f651ff1f-fe19-4fb8-9946-d835b94326c7", 00:18:29.410 "strip_size_kb": 64, 00:18:29.410 "state": "configuring", 00:18:29.410 "raid_level": "raid0", 00:18:29.410 "superblock": true, 00:18:29.410 "num_base_bdevs": 4, 00:18:29.410 "num_base_bdevs_discovered": 1, 00:18:29.410 "num_base_bdevs_operational": 4, 00:18:29.410 "base_bdevs_list": [ 00:18:29.410 { 00:18:29.410 "name": "BaseBdev1", 00:18:29.410 "uuid": "5667a42f-a928-477d-b0da-5073f5d8f5e9", 00:18:29.410 "is_configured": true, 00:18:29.410 "data_offset": 2048, 
00:18:29.410 "data_size": 63488 00:18:29.410 }, 00:18:29.410 { 00:18:29.410 "name": "BaseBdev2", 00:18:29.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.410 "is_configured": false, 00:18:29.410 "data_offset": 0, 00:18:29.410 "data_size": 0 00:18:29.410 }, 00:18:29.410 { 00:18:29.410 "name": "BaseBdev3", 00:18:29.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.410 "is_configured": false, 00:18:29.410 "data_offset": 0, 00:18:29.410 "data_size": 0 00:18:29.410 }, 00:18:29.410 { 00:18:29.410 "name": "BaseBdev4", 00:18:29.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:29.410 "is_configured": false, 00:18:29.410 "data_offset": 0, 00:18:29.410 "data_size": 0 00:18:29.410 } 00:18:29.410 ] 00:18:29.410 }' 00:18:29.410 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.410 15:54:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:29.980 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:29.980 [2024-07-12 15:54:50.427896] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:30.241 BaseBdev2 00:18:30.241 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:30.241 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:30.241 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:30.241 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:30.241 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:30.241 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:30.241 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:30.241 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:30.501 [ 00:18:30.501 { 00:18:30.501 "name": "BaseBdev2", 00:18:30.501 "aliases": [ 00:18:30.501 "cdd2bd90-ac03-4b35-9bd6-5f0c09d6f629" 00:18:30.501 ], 00:18:30.501 "product_name": "Malloc disk", 00:18:30.501 "block_size": 512, 00:18:30.501 "num_blocks": 65536, 00:18:30.501 "uuid": "cdd2bd90-ac03-4b35-9bd6-5f0c09d6f629", 00:18:30.501 "assigned_rate_limits": { 00:18:30.501 "rw_ios_per_sec": 0, 00:18:30.501 "rw_mbytes_per_sec": 0, 00:18:30.501 "r_mbytes_per_sec": 0, 00:18:30.501 "w_mbytes_per_sec": 0 00:18:30.501 }, 00:18:30.501 "claimed": true, 00:18:30.501 "claim_type": "exclusive_write", 00:18:30.501 "zoned": false, 00:18:30.501 "supported_io_types": { 00:18:30.501 "read": true, 00:18:30.501 "write": true, 00:18:30.501 "unmap": true, 00:18:30.501 "flush": true, 00:18:30.501 "reset": true, 00:18:30.501 "nvme_admin": false, 00:18:30.501 "nvme_io": false, 00:18:30.501 "nvme_io_md": false, 00:18:30.501 "write_zeroes": true, 00:18:30.501 "zcopy": true, 00:18:30.501 "get_zone_info": false, 00:18:30.501 "zone_management": false, 00:18:30.501 "zone_append": false, 00:18:30.501 "compare": false, 
00:18:30.501 "compare_and_write": false, 00:18:30.501 "abort": true, 00:18:30.501 "seek_hole": false, 00:18:30.501 "seek_data": false, 00:18:30.501 "copy": true, 00:18:30.501 "nvme_iov_md": false 00:18:30.501 }, 00:18:30.501 "memory_domains": [ 00:18:30.501 { 00:18:30.501 "dma_device_id": "system", 00:18:30.501 "dma_device_type": 1 00:18:30.501 }, 00:18:30.501 { 00:18:30.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.501 "dma_device_type": 2 00:18:30.501 } 00:18:30.501 ], 00:18:30.501 "driver_specific": {} 00:18:30.501 } 00:18:30.501 ] 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.501 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:30.761 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.761 "name": "Existed_Raid", 00:18:30.761 "uuid": "f651ff1f-fe19-4fb8-9946-d835b94326c7", 00:18:30.761 "strip_size_kb": 64, 00:18:30.761 "state": "configuring", 00:18:30.761 "raid_level": "raid0", 00:18:30.761 "superblock": true, 00:18:30.761 "num_base_bdevs": 4, 00:18:30.761 "num_base_bdevs_discovered": 2, 00:18:30.761 "num_base_bdevs_operational": 4, 00:18:30.761 "base_bdevs_list": [ 00:18:30.761 { 00:18:30.761 "name": "BaseBdev1", 00:18:30.761 "uuid": "5667a42f-a928-477d-b0da-5073f5d8f5e9", 00:18:30.761 "is_configured": true, 00:18:30.761 "data_offset": 2048, 00:18:30.761 "data_size": 63488 00:18:30.761 }, 00:18:30.761 { 00:18:30.761 "name": "BaseBdev2", 00:18:30.761 "uuid": "cdd2bd90-ac03-4b35-9bd6-5f0c09d6f629", 00:18:30.761 "is_configured": true, 00:18:30.761 "data_offset": 2048, 00:18:30.761 "data_size": 63488 00:18:30.761 }, 00:18:30.761 { 00:18:30.761 "name": "BaseBdev3", 00:18:30.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.761 "is_configured": false, 00:18:30.761 "data_offset": 0, 00:18:30.761 
"data_size": 0 00:18:30.761 }, 00:18:30.761 { 00:18:30.761 "name": "BaseBdev4", 00:18:30.761 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:30.761 "is_configured": false, 00:18:30.761 "data_offset": 0, 00:18:30.761 "data_size": 0 00:18:30.761 } 00:18:30.761 ] 00:18:30.761 }' 00:18:30.761 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.761 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:31.700 15:54:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:31.700 [2024-07-12 15:54:52.073025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:31.700 BaseBdev3 00:18:31.700 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:31.700 15:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:31.700 15:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:31.701 15:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:31.701 15:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:31.701 15:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:31.701 15:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:31.960 15:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:32.220 [ 00:18:32.220 { 00:18:32.220 "name": "BaseBdev3", 00:18:32.220 "aliases": [ 00:18:32.220 "4156f27f-c12f-4c05-9433-33e078d3d05d" 00:18:32.220 ], 00:18:32.220 "product_name": "Malloc disk", 00:18:32.220 "block_size": 512, 00:18:32.220 "num_blocks": 65536, 00:18:32.220 "uuid": "4156f27f-c12f-4c05-9433-33e078d3d05d", 00:18:32.220 "assigned_rate_limits": { 00:18:32.220 "rw_ios_per_sec": 0, 00:18:32.220 "rw_mbytes_per_sec": 0, 00:18:32.220 "r_mbytes_per_sec": 0, 00:18:32.220 "w_mbytes_per_sec": 0 00:18:32.220 }, 00:18:32.220 "claimed": true, 00:18:32.220 "claim_type": "exclusive_write", 00:18:32.220 "zoned": false, 00:18:32.220 "supported_io_types": { 00:18:32.220 "read": true, 00:18:32.220 "write": true, 00:18:32.220 "unmap": true, 00:18:32.220 "flush": true, 00:18:32.220 "reset": true, 00:18:32.220 "nvme_admin": false, 00:18:32.220 "nvme_io": false, 00:18:32.220 "nvme_io_md": false, 00:18:32.220 "write_zeroes": true, 00:18:32.220 "zcopy": true, 00:18:32.220 "get_zone_info": false, 00:18:32.220 "zone_management": false, 00:18:32.220 "zone_append": false, 00:18:32.220 "compare": false, 00:18:32.220 "compare_and_write": false, 00:18:32.220 "abort": true, 00:18:32.220 "seek_hole": false, 00:18:32.220 "seek_data": false, 00:18:32.220 "copy": true, 00:18:32.220 "nvme_iov_md": false 00:18:32.220 }, 00:18:32.220 "memory_domains": [ 00:18:32.220 { 00:18:32.220 "dma_device_id": "system", 00:18:32.220 "dma_device_type": 1 00:18:32.220 }, 00:18:32.220 { 00:18:32.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.220 "dma_device_type": 2 
00:18:32.220 } 00:18:32.220 ], 00:18:32.220 "driver_specific": {} 00:18:32.220 } 00:18:32.220 ] 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:32.220 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:32.220 "name": "Existed_Raid", 00:18:32.220 "uuid": "f651ff1f-fe19-4fb8-9946-d835b94326c7", 00:18:32.220 "strip_size_kb": 64, 00:18:32.220 "state": "configuring", 00:18:32.220 "raid_level": "raid0", 00:18:32.220 "superblock": true, 00:18:32.220 "num_base_bdevs": 4, 00:18:32.220 "num_base_bdevs_discovered": 3, 00:18:32.220 "num_base_bdevs_operational": 4, 00:18:32.220 "base_bdevs_list": [ 00:18:32.220 { 00:18:32.220 "name": "BaseBdev1", 00:18:32.220 "uuid": "5667a42f-a928-477d-b0da-5073f5d8f5e9", 00:18:32.220 "is_configured": true, 00:18:32.220 "data_offset": 2048, 00:18:32.220 "data_size": 63488 00:18:32.220 }, 00:18:32.220 { 00:18:32.220 "name": "BaseBdev2", 00:18:32.220 "uuid": "cdd2bd90-ac03-4b35-9bd6-5f0c09d6f629", 00:18:32.220 "is_configured": true, 00:18:32.221 "data_offset": 2048, 00:18:32.221 "data_size": 63488 00:18:32.221 }, 00:18:32.221 { 00:18:32.221 "name": "BaseBdev3", 00:18:32.221 "uuid": "4156f27f-c12f-4c05-9433-33e078d3d05d", 00:18:32.221 "is_configured": true, 00:18:32.221 "data_offset": 2048, 00:18:32.221 "data_size": 63488 00:18:32.221 }, 00:18:32.221 { 00:18:32.221 "name": "BaseBdev4", 00:18:32.221 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:32.221 "is_configured": false, 00:18:32.221 "data_offset": 0, 00:18:32.221 "data_size": 0 00:18:32.221 } 00:18:32.221 ] 00:18:32.221 }' 00:18:32.221 15:54:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:32.221 15:54:52 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:18:32.794 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:33.053 [2024-07-12 15:54:53.353194] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:33.054 [2024-07-12 15:54:53.353319] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f741d0 00:18:33.054 [2024-07-12 15:54:53.353327] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:33.054 [2024-07-12 15:54:53.353468] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f75220 00:18:33.054 [2024-07-12 15:54:53.353564] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f741d0 00:18:33.054 [2024-07-12 15:54:53.353570] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f741d0 00:18:33.054 [2024-07-12 15:54:53.353637] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:33.054 BaseBdev4 00:18:33.054 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:33.054 15:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:33.054 15:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:33.054 15:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:33.054 15:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:33.054 15:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:33.054 15:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:33.314 [ 00:18:33.314 { 00:18:33.314 "name": "BaseBdev4", 00:18:33.314 "aliases": [ 00:18:33.314 "335e6e22-8e2c-4b0a-ba85-5ec183eb14bb" 00:18:33.314 ], 00:18:33.314 "product_name": "Malloc disk", 00:18:33.314 "block_size": 512, 00:18:33.314 "num_blocks": 65536, 00:18:33.314 "uuid": "335e6e22-8e2c-4b0a-ba85-5ec183eb14bb", 00:18:33.314 "assigned_rate_limits": { 00:18:33.314 "rw_ios_per_sec": 0, 00:18:33.314 "rw_mbytes_per_sec": 0, 00:18:33.314 "r_mbytes_per_sec": 0, 00:18:33.314 "w_mbytes_per_sec": 0 00:18:33.314 }, 00:18:33.314 "claimed": true, 00:18:33.314 "claim_type": "exclusive_write", 00:18:33.314 "zoned": false, 00:18:33.314 "supported_io_types": { 00:18:33.314 "read": true, 00:18:33.314 "write": true, 00:18:33.314 "unmap": true, 00:18:33.314 "flush": true, 00:18:33.314 "reset": true, 00:18:33.314 "nvme_admin": false, 00:18:33.314 "nvme_io": false, 00:18:33.314 "nvme_io_md": false, 00:18:33.314 "write_zeroes": true, 00:18:33.314 "zcopy": true, 00:18:33.314 "get_zone_info": false, 00:18:33.314 "zone_management": false, 00:18:33.314 "zone_append": false, 00:18:33.314 "compare": false, 00:18:33.314 "compare_and_write": false, 00:18:33.314 "abort": true, 00:18:33.314 "seek_hole": false, 00:18:33.314 "seek_data": false, 00:18:33.314 "copy": 
true, 00:18:33.314 "nvme_iov_md": false 00:18:33.314 }, 00:18:33.314 "memory_domains": [ 00:18:33.314 { 00:18:33.314 "dma_device_id": "system", 00:18:33.314 "dma_device_type": 1 00:18:33.314 }, 00:18:33.314 { 00:18:33.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.314 "dma_device_type": 2 00:18:33.314 } 00:18:33.314 ], 00:18:33.314 "driver_specific": {} 00:18:33.314 } 00:18:33.314 ] 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.314 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.575 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.575 "name": "Existed_Raid", 00:18:33.575 "uuid": "f651ff1f-fe19-4fb8-9946-d835b94326c7", 00:18:33.575 "strip_size_kb": 64, 00:18:33.575 "state": "online", 00:18:33.575 "raid_level": "raid0", 00:18:33.575 "superblock": true, 00:18:33.575 "num_base_bdevs": 4, 00:18:33.575 "num_base_bdevs_discovered": 4, 00:18:33.575 "num_base_bdevs_operational": 4, 00:18:33.575 "base_bdevs_list": [ 00:18:33.575 { 00:18:33.575 "name": "BaseBdev1", 00:18:33.575 "uuid": "5667a42f-a928-477d-b0da-5073f5d8f5e9", 00:18:33.575 "is_configured": true, 00:18:33.575 "data_offset": 2048, 00:18:33.575 "data_size": 63488 00:18:33.575 }, 00:18:33.575 { 00:18:33.575 "name": "BaseBdev2", 00:18:33.575 "uuid": "cdd2bd90-ac03-4b35-9bd6-5f0c09d6f629", 00:18:33.575 "is_configured": true, 00:18:33.575 "data_offset": 2048, 00:18:33.575 "data_size": 63488 00:18:33.575 }, 00:18:33.575 { 00:18:33.575 "name": "BaseBdev3", 00:18:33.575 "uuid": "4156f27f-c12f-4c05-9433-33e078d3d05d", 00:18:33.575 "is_configured": true, 00:18:33.575 "data_offset": 2048, 00:18:33.575 "data_size": 63488 00:18:33.575 }, 00:18:33.575 { 00:18:33.575 "name": "BaseBdev4", 00:18:33.575 "uuid": "335e6e22-8e2c-4b0a-ba85-5ec183eb14bb", 00:18:33.575 
"is_configured": true, 00:18:33.575 "data_offset": 2048, 00:18:33.575 "data_size": 63488 00:18:33.575 } 00:18:33.575 ] 00:18:33.575 }' 00:18:33.575 15:54:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.575 15:54:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:34.144 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:34.144 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:34.144 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:34.144 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:34.144 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:34.144 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:34.144 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:34.144 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:34.404 [2024-07-12 15:54:54.720921] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:34.404 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:34.404 "name": "Existed_Raid", 00:18:34.404 "aliases": [ 00:18:34.404 "f651ff1f-fe19-4fb8-9946-d835b94326c7" 00:18:34.404 ], 00:18:34.404 "product_name": "Raid Volume", 00:18:34.404 "block_size": 512, 00:18:34.404 "num_blocks": 253952, 00:18:34.404 "uuid": "f651ff1f-fe19-4fb8-9946-d835b94326c7", 00:18:34.404 "assigned_rate_limits": { 00:18:34.404 "rw_ios_per_sec": 0, 00:18:34.404 "rw_mbytes_per_sec": 0, 00:18:34.404 "r_mbytes_per_sec": 0, 00:18:34.404 "w_mbytes_per_sec": 0 00:18:34.404 }, 00:18:34.404 "claimed": false, 00:18:34.404 "zoned": false, 00:18:34.404 "supported_io_types": { 00:18:34.404 "read": true, 00:18:34.404 "write": true, 00:18:34.404 "unmap": true, 00:18:34.404 "flush": true, 00:18:34.404 "reset": true, 00:18:34.404 "nvme_admin": false, 00:18:34.404 "nvme_io": false, 00:18:34.404 "nvme_io_md": false, 00:18:34.404 "write_zeroes": true, 00:18:34.404 "zcopy": false, 00:18:34.404 "get_zone_info": false, 00:18:34.404 "zone_management": false, 00:18:34.404 "zone_append": false, 00:18:34.404 "compare": false, 00:18:34.404 "compare_and_write": false, 00:18:34.404 "abort": false, 00:18:34.404 "seek_hole": false, 00:18:34.404 "seek_data": false, 00:18:34.404 "copy": false, 00:18:34.404 "nvme_iov_md": false 00:18:34.404 }, 00:18:34.404 "memory_domains": [ 00:18:34.404 { 00:18:34.404 "dma_device_id": "system", 00:18:34.404 "dma_device_type": 1 00:18:34.404 }, 00:18:34.404 { 00:18:34.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.404 "dma_device_type": 2 00:18:34.404 }, 00:18:34.404 { 00:18:34.404 "dma_device_id": "system", 00:18:34.404 "dma_device_type": 1 00:18:34.404 }, 00:18:34.404 { 00:18:34.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.404 "dma_device_type": 2 00:18:34.404 }, 00:18:34.404 { 00:18:34.404 "dma_device_id": "system", 00:18:34.404 "dma_device_type": 1 00:18:34.404 }, 00:18:34.404 { 00:18:34.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.404 "dma_device_type": 2 00:18:34.404 }, 00:18:34.404 { 
00:18:34.404 "dma_device_id": "system", 00:18:34.404 "dma_device_type": 1 00:18:34.404 }, 00:18:34.404 { 00:18:34.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.404 "dma_device_type": 2 00:18:34.404 } 00:18:34.404 ], 00:18:34.404 "driver_specific": { 00:18:34.404 "raid": { 00:18:34.404 "uuid": "f651ff1f-fe19-4fb8-9946-d835b94326c7", 00:18:34.404 "strip_size_kb": 64, 00:18:34.404 "state": "online", 00:18:34.404 "raid_level": "raid0", 00:18:34.404 "superblock": true, 00:18:34.404 "num_base_bdevs": 4, 00:18:34.404 "num_base_bdevs_discovered": 4, 00:18:34.404 "num_base_bdevs_operational": 4, 00:18:34.404 "base_bdevs_list": [ 00:18:34.404 { 00:18:34.404 "name": "BaseBdev1", 00:18:34.404 "uuid": "5667a42f-a928-477d-b0da-5073f5d8f5e9", 00:18:34.404 "is_configured": true, 00:18:34.404 "data_offset": 2048, 00:18:34.404 "data_size": 63488 00:18:34.404 }, 00:18:34.404 { 00:18:34.404 "name": "BaseBdev2", 00:18:34.404 "uuid": "cdd2bd90-ac03-4b35-9bd6-5f0c09d6f629", 00:18:34.404 "is_configured": true, 00:18:34.404 "data_offset": 2048, 00:18:34.404 "data_size": 63488 00:18:34.404 }, 00:18:34.404 { 00:18:34.404 "name": "BaseBdev3", 00:18:34.404 "uuid": "4156f27f-c12f-4c05-9433-33e078d3d05d", 00:18:34.404 "is_configured": true, 00:18:34.404 "data_offset": 2048, 00:18:34.404 "data_size": 63488 00:18:34.404 }, 00:18:34.404 { 00:18:34.404 "name": "BaseBdev4", 00:18:34.404 "uuid": "335e6e22-8e2c-4b0a-ba85-5ec183eb14bb", 00:18:34.404 "is_configured": true, 00:18:34.404 "data_offset": 2048, 00:18:34.404 "data_size": 63488 00:18:34.404 } 00:18:34.404 ] 00:18:34.404 } 00:18:34.404 } 00:18:34.404 }' 00:18:34.404 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:34.404 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:34.404 BaseBdev2 00:18:34.404 BaseBdev3 00:18:34.404 BaseBdev4' 00:18:34.404 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:34.404 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:34.404 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:34.664 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:34.664 "name": "BaseBdev1", 00:18:34.664 "aliases": [ 00:18:34.664 "5667a42f-a928-477d-b0da-5073f5d8f5e9" 00:18:34.664 ], 00:18:34.664 "product_name": "Malloc disk", 00:18:34.664 "block_size": 512, 00:18:34.664 "num_blocks": 65536, 00:18:34.664 "uuid": "5667a42f-a928-477d-b0da-5073f5d8f5e9", 00:18:34.664 "assigned_rate_limits": { 00:18:34.664 "rw_ios_per_sec": 0, 00:18:34.664 "rw_mbytes_per_sec": 0, 00:18:34.664 "r_mbytes_per_sec": 0, 00:18:34.664 "w_mbytes_per_sec": 0 00:18:34.664 }, 00:18:34.664 "claimed": true, 00:18:34.664 "claim_type": "exclusive_write", 00:18:34.664 "zoned": false, 00:18:34.664 "supported_io_types": { 00:18:34.664 "read": true, 00:18:34.664 "write": true, 00:18:34.664 "unmap": true, 00:18:34.664 "flush": true, 00:18:34.664 "reset": true, 00:18:34.664 "nvme_admin": false, 00:18:34.664 "nvme_io": false, 00:18:34.664 "nvme_io_md": false, 00:18:34.664 "write_zeroes": true, 00:18:34.664 "zcopy": true, 00:18:34.664 "get_zone_info": false, 00:18:34.664 "zone_management": false, 00:18:34.664 "zone_append": 
false, 00:18:34.664 "compare": false, 00:18:34.664 "compare_and_write": false, 00:18:34.664 "abort": true, 00:18:34.664 "seek_hole": false, 00:18:34.664 "seek_data": false, 00:18:34.664 "copy": true, 00:18:34.664 "nvme_iov_md": false 00:18:34.664 }, 00:18:34.664 "memory_domains": [ 00:18:34.664 { 00:18:34.664 "dma_device_id": "system", 00:18:34.664 "dma_device_type": 1 00:18:34.664 }, 00:18:34.664 { 00:18:34.664 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.664 "dma_device_type": 2 00:18:34.664 } 00:18:34.664 ], 00:18:34.664 "driver_specific": {} 00:18:34.664 }' 00:18:34.664 15:54:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.665 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.665 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:34.665 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.665 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:34.924 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:35.184 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:35.185 "name": "BaseBdev2", 00:18:35.185 "aliases": [ 00:18:35.185 "cdd2bd90-ac03-4b35-9bd6-5f0c09d6f629" 00:18:35.185 ], 00:18:35.185 "product_name": "Malloc disk", 00:18:35.185 "block_size": 512, 00:18:35.185 "num_blocks": 65536, 00:18:35.185 "uuid": "cdd2bd90-ac03-4b35-9bd6-5f0c09d6f629", 00:18:35.185 "assigned_rate_limits": { 00:18:35.185 "rw_ios_per_sec": 0, 00:18:35.185 "rw_mbytes_per_sec": 0, 00:18:35.185 "r_mbytes_per_sec": 0, 00:18:35.185 "w_mbytes_per_sec": 0 00:18:35.185 }, 00:18:35.185 "claimed": true, 00:18:35.185 "claim_type": "exclusive_write", 00:18:35.185 "zoned": false, 00:18:35.185 "supported_io_types": { 00:18:35.185 "read": true, 00:18:35.185 "write": true, 00:18:35.185 "unmap": true, 00:18:35.185 "flush": true, 00:18:35.185 "reset": true, 00:18:35.185 "nvme_admin": false, 00:18:35.185 "nvme_io": false, 00:18:35.185 "nvme_io_md": false, 00:18:35.185 "write_zeroes": true, 00:18:35.185 "zcopy": true, 00:18:35.185 "get_zone_info": false, 00:18:35.185 "zone_management": false, 00:18:35.185 "zone_append": false, 00:18:35.185 "compare": false, 00:18:35.185 "compare_and_write": false, 00:18:35.185 "abort": true, 00:18:35.185 "seek_hole": 
false, 00:18:35.185 "seek_data": false, 00:18:35.185 "copy": true, 00:18:35.185 "nvme_iov_md": false 00:18:35.185 }, 00:18:35.185 "memory_domains": [ 00:18:35.185 { 00:18:35.185 "dma_device_id": "system", 00:18:35.185 "dma_device_type": 1 00:18:35.185 }, 00:18:35.185 { 00:18:35.185 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.185 "dma_device_type": 2 00:18:35.185 } 00:18:35.185 ], 00:18:35.185 "driver_specific": {} 00:18:35.185 }' 00:18:35.185 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.185 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.185 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:35.185 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:35.445 15:54:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:35.704 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:35.704 "name": "BaseBdev3", 00:18:35.704 "aliases": [ 00:18:35.704 "4156f27f-c12f-4c05-9433-33e078d3d05d" 00:18:35.704 ], 00:18:35.704 "product_name": "Malloc disk", 00:18:35.704 "block_size": 512, 00:18:35.704 "num_blocks": 65536, 00:18:35.704 "uuid": "4156f27f-c12f-4c05-9433-33e078d3d05d", 00:18:35.704 "assigned_rate_limits": { 00:18:35.704 "rw_ios_per_sec": 0, 00:18:35.704 "rw_mbytes_per_sec": 0, 00:18:35.704 "r_mbytes_per_sec": 0, 00:18:35.704 "w_mbytes_per_sec": 0 00:18:35.704 }, 00:18:35.704 "claimed": true, 00:18:35.704 "claim_type": "exclusive_write", 00:18:35.704 "zoned": false, 00:18:35.704 "supported_io_types": { 00:18:35.704 "read": true, 00:18:35.704 "write": true, 00:18:35.704 "unmap": true, 00:18:35.704 "flush": true, 00:18:35.704 "reset": true, 00:18:35.704 "nvme_admin": false, 00:18:35.704 "nvme_io": false, 00:18:35.704 "nvme_io_md": false, 00:18:35.704 "write_zeroes": true, 00:18:35.704 "zcopy": true, 00:18:35.704 "get_zone_info": false, 00:18:35.704 "zone_management": false, 00:18:35.704 "zone_append": false, 00:18:35.705 "compare": false, 00:18:35.705 "compare_and_write": false, 00:18:35.705 "abort": true, 00:18:35.705 "seek_hole": false, 00:18:35.705 "seek_data": false, 00:18:35.705 "copy": true, 00:18:35.705 "nvme_iov_md": false 00:18:35.705 }, 00:18:35.705 
"memory_domains": [ 00:18:35.705 { 00:18:35.705 "dma_device_id": "system", 00:18:35.705 "dma_device_type": 1 00:18:35.705 }, 00:18:35.705 { 00:18:35.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.705 "dma_device_type": 2 00:18:35.705 } 00:18:35.705 ], 00:18:35.705 "driver_specific": {} 00:18:35.705 }' 00:18:35.705 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.705 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.705 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:35.705 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.964 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.964 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:35.964 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.964 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.964 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:35.964 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.964 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.223 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:36.223 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:36.223 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:36.223 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:36.223 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:36.223 "name": "BaseBdev4", 00:18:36.223 "aliases": [ 00:18:36.223 "335e6e22-8e2c-4b0a-ba85-5ec183eb14bb" 00:18:36.223 ], 00:18:36.223 "product_name": "Malloc disk", 00:18:36.223 "block_size": 512, 00:18:36.223 "num_blocks": 65536, 00:18:36.223 "uuid": "335e6e22-8e2c-4b0a-ba85-5ec183eb14bb", 00:18:36.223 "assigned_rate_limits": { 00:18:36.223 "rw_ios_per_sec": 0, 00:18:36.223 "rw_mbytes_per_sec": 0, 00:18:36.223 "r_mbytes_per_sec": 0, 00:18:36.223 "w_mbytes_per_sec": 0 00:18:36.223 }, 00:18:36.223 "claimed": true, 00:18:36.223 "claim_type": "exclusive_write", 00:18:36.223 "zoned": false, 00:18:36.223 "supported_io_types": { 00:18:36.223 "read": true, 00:18:36.223 "write": true, 00:18:36.223 "unmap": true, 00:18:36.223 "flush": true, 00:18:36.223 "reset": true, 00:18:36.223 "nvme_admin": false, 00:18:36.223 "nvme_io": false, 00:18:36.223 "nvme_io_md": false, 00:18:36.223 "write_zeroes": true, 00:18:36.223 "zcopy": true, 00:18:36.223 "get_zone_info": false, 00:18:36.223 "zone_management": false, 00:18:36.223 "zone_append": false, 00:18:36.223 "compare": false, 00:18:36.223 "compare_and_write": false, 00:18:36.223 "abort": true, 00:18:36.223 "seek_hole": false, 00:18:36.223 "seek_data": false, 00:18:36.223 "copy": true, 00:18:36.223 "nvme_iov_md": false 00:18:36.223 }, 00:18:36.223 "memory_domains": [ 00:18:36.223 { 00:18:36.223 "dma_device_id": "system", 00:18:36.223 "dma_device_type": 1 00:18:36.223 }, 
00:18:36.223 { 00:18:36.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:36.223 "dma_device_type": 2 00:18:36.223 } 00:18:36.223 ], 00:18:36.223 "driver_specific": {} 00:18:36.223 }' 00:18:36.223 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:36.482 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:36.482 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:36.482 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:36.482 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:36.482 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:36.482 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:36.742 15:54:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:36.742 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:36.742 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.742 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:36.742 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:36.742 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:37.002 [2024-07-12 15:54:57.303267] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:37.002 [2024-07-12 15:54:57.303288] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:37.002 [2024-07-12 15:54:57.303324] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.002 15:54:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.002 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:37.262 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:37.262 "name": "Existed_Raid", 00:18:37.262 "uuid": "f651ff1f-fe19-4fb8-9946-d835b94326c7", 00:18:37.262 "strip_size_kb": 64, 00:18:37.262 "state": "offline", 00:18:37.262 "raid_level": "raid0", 00:18:37.262 "superblock": true, 00:18:37.262 "num_base_bdevs": 4, 00:18:37.262 "num_base_bdevs_discovered": 3, 00:18:37.262 "num_base_bdevs_operational": 3, 00:18:37.262 "base_bdevs_list": [ 00:18:37.262 { 00:18:37.262 "name": null, 00:18:37.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:37.262 "is_configured": false, 00:18:37.262 "data_offset": 2048, 00:18:37.262 "data_size": 63488 00:18:37.262 }, 00:18:37.262 { 00:18:37.262 "name": "BaseBdev2", 00:18:37.262 "uuid": "cdd2bd90-ac03-4b35-9bd6-5f0c09d6f629", 00:18:37.262 "is_configured": true, 00:18:37.262 "data_offset": 2048, 00:18:37.262 "data_size": 63488 00:18:37.262 }, 00:18:37.262 { 00:18:37.262 "name": "BaseBdev3", 00:18:37.262 "uuid": "4156f27f-c12f-4c05-9433-33e078d3d05d", 00:18:37.262 "is_configured": true, 00:18:37.262 "data_offset": 2048, 00:18:37.262 "data_size": 63488 00:18:37.262 }, 00:18:37.262 { 00:18:37.262 "name": "BaseBdev4", 00:18:37.262 "uuid": "335e6e22-8e2c-4b0a-ba85-5ec183eb14bb", 00:18:37.262 "is_configured": true, 00:18:37.262 "data_offset": 2048, 00:18:37.262 "data_size": 63488 00:18:37.262 } 00:18:37.262 ] 00:18:37.262 }' 00:18:37.262 15:54:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:37.262 15:54:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:37.831 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:37.831 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:37.831 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.831 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:37.831 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:37.831 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:37.831 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:38.091 [2024-07-12 15:54:58.414083] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:38.091 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:38.091 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:38.091 15:54:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.091 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:38.350 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:38.351 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:38.351 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:38.610 [2024-07-12 15:54:58.804929] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:38.610 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:38.610 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:38.610 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.610 15:54:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:38.610 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:38.610 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:38.610 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:38.878 [2024-07-12 15:54:59.191712] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:38.878 [2024-07-12 15:54:59.191745] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f741d0 name Existed_Raid, state offline 00:18:38.878 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:38.878 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:38.878 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.878 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:39.178 BaseBdev2 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:39.178 15:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:39.438 15:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:39.699 [ 00:18:39.699 { 00:18:39.699 "name": "BaseBdev2", 00:18:39.699 "aliases": [ 00:18:39.699 "f2a6c488-f4de-430b-a22e-697581e0ad6a" 00:18:39.699 ], 00:18:39.699 "product_name": "Malloc disk", 00:18:39.699 "block_size": 512, 00:18:39.699 "num_blocks": 65536, 00:18:39.699 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:39.699 "assigned_rate_limits": { 00:18:39.699 "rw_ios_per_sec": 0, 00:18:39.699 "rw_mbytes_per_sec": 0, 00:18:39.699 "r_mbytes_per_sec": 0, 00:18:39.699 "w_mbytes_per_sec": 0 00:18:39.699 }, 00:18:39.699 "claimed": false, 00:18:39.699 "zoned": false, 00:18:39.699 "supported_io_types": { 00:18:39.699 "read": true, 00:18:39.699 "write": true, 00:18:39.699 "unmap": true, 00:18:39.699 "flush": true, 00:18:39.699 "reset": true, 00:18:39.699 "nvme_admin": false, 00:18:39.699 "nvme_io": false, 00:18:39.699 "nvme_io_md": false, 00:18:39.699 "write_zeroes": true, 00:18:39.699 "zcopy": true, 00:18:39.699 "get_zone_info": false, 00:18:39.699 "zone_management": false, 00:18:39.699 "zone_append": false, 00:18:39.699 "compare": false, 00:18:39.699 "compare_and_write": false, 00:18:39.699 "abort": true, 00:18:39.699 "seek_hole": false, 00:18:39.699 "seek_data": false, 00:18:39.699 "copy": true, 00:18:39.699 "nvme_iov_md": false 00:18:39.699 }, 00:18:39.699 "memory_domains": [ 00:18:39.699 { 00:18:39.699 "dma_device_id": "system", 00:18:39.699 "dma_device_type": 1 00:18:39.699 }, 00:18:39.699 { 00:18:39.699 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.699 "dma_device_type": 2 00:18:39.699 } 00:18:39.699 ], 00:18:39.699 "driver_specific": {} 00:18:39.700 } 00:18:39.700 ] 00:18:39.700 15:54:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:39.700 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:39.700 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:39.700 15:54:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:39.960 BaseBdev3 00:18:39.960 15:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:39.960 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:39.960 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:39.960 15:55:00 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:18:39.960 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:39.960 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:39.960 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:39.960 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:40.220 [ 00:18:40.220 { 00:18:40.220 "name": "BaseBdev3", 00:18:40.220 "aliases": [ 00:18:40.220 "c3270ebf-07d2-48d8-b878-fb7f5f4a304f" 00:18:40.220 ], 00:18:40.220 "product_name": "Malloc disk", 00:18:40.220 "block_size": 512, 00:18:40.220 "num_blocks": 65536, 00:18:40.220 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:40.220 "assigned_rate_limits": { 00:18:40.220 "rw_ios_per_sec": 0, 00:18:40.220 "rw_mbytes_per_sec": 0, 00:18:40.220 "r_mbytes_per_sec": 0, 00:18:40.220 "w_mbytes_per_sec": 0 00:18:40.220 }, 00:18:40.220 "claimed": false, 00:18:40.220 "zoned": false, 00:18:40.220 "supported_io_types": { 00:18:40.220 "read": true, 00:18:40.220 "write": true, 00:18:40.220 "unmap": true, 00:18:40.220 "flush": true, 00:18:40.220 "reset": true, 00:18:40.220 "nvme_admin": false, 00:18:40.220 "nvme_io": false, 00:18:40.220 "nvme_io_md": false, 00:18:40.220 "write_zeroes": true, 00:18:40.220 "zcopy": true, 00:18:40.220 "get_zone_info": false, 00:18:40.220 "zone_management": false, 00:18:40.220 "zone_append": false, 00:18:40.220 "compare": false, 00:18:40.220 "compare_and_write": false, 00:18:40.220 "abort": true, 00:18:40.220 "seek_hole": false, 00:18:40.220 "seek_data": false, 00:18:40.220 "copy": true, 00:18:40.220 "nvme_iov_md": false 00:18:40.220 }, 00:18:40.220 "memory_domains": [ 00:18:40.220 { 00:18:40.220 "dma_device_id": "system", 00:18:40.220 "dma_device_type": 1 00:18:40.220 }, 00:18:40.220 { 00:18:40.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.220 "dma_device_type": 2 00:18:40.220 } 00:18:40.220 ], 00:18:40.220 "driver_specific": {} 00:18:40.220 } 00:18:40.220 ] 00:18:40.220 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:40.220 15:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:40.220 15:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:40.220 15:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:40.489 BaseBdev4 00:18:40.489 15:55:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:40.489 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:18:40.489 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:40.489 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:40.489 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:40.489 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:18:40.489 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:40.489 15:55:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:40.753 [ 00:18:40.753 { 00:18:40.753 "name": "BaseBdev4", 00:18:40.753 "aliases": [ 00:18:40.753 "fbf7ac42-4a08-4f79-b879-47d983786f25" 00:18:40.753 ], 00:18:40.753 "product_name": "Malloc disk", 00:18:40.753 "block_size": 512, 00:18:40.753 "num_blocks": 65536, 00:18:40.753 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:40.753 "assigned_rate_limits": { 00:18:40.753 "rw_ios_per_sec": 0, 00:18:40.753 "rw_mbytes_per_sec": 0, 00:18:40.753 "r_mbytes_per_sec": 0, 00:18:40.753 "w_mbytes_per_sec": 0 00:18:40.753 }, 00:18:40.753 "claimed": false, 00:18:40.753 "zoned": false, 00:18:40.753 "supported_io_types": { 00:18:40.753 "read": true, 00:18:40.753 "write": true, 00:18:40.753 "unmap": true, 00:18:40.753 "flush": true, 00:18:40.753 "reset": true, 00:18:40.753 "nvme_admin": false, 00:18:40.753 "nvme_io": false, 00:18:40.753 "nvme_io_md": false, 00:18:40.753 "write_zeroes": true, 00:18:40.753 "zcopy": true, 00:18:40.753 "get_zone_info": false, 00:18:40.753 "zone_management": false, 00:18:40.753 "zone_append": false, 00:18:40.753 "compare": false, 00:18:40.753 "compare_and_write": false, 00:18:40.753 "abort": true, 00:18:40.753 "seek_hole": false, 00:18:40.753 "seek_data": false, 00:18:40.753 "copy": true, 00:18:40.753 "nvme_iov_md": false 00:18:40.753 }, 00:18:40.753 "memory_domains": [ 00:18:40.753 { 00:18:40.753 "dma_device_id": "system", 00:18:40.753 "dma_device_type": 1 00:18:40.753 }, 00:18:40.753 { 00:18:40.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.753 "dma_device_type": 2 00:18:40.753 } 00:18:40.753 ], 00:18:40.753 "driver_specific": {} 00:18:40.753 } 00:18:40.753 ] 00:18:40.753 15:55:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:40.753 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:40.753 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:40.753 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:41.014 [2024-07-12 15:55:01.282659] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:41.014 [2024-07-12 15:55:01.282692] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:41.014 [2024-07-12 15:55:01.282706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:41.014 [2024-07-12 15:55:01.283749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:41.014 [2024-07-12 15:55:01.283782] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.014 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:41.275 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.275 "name": "Existed_Raid", 00:18:41.275 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:41.275 "strip_size_kb": 64, 00:18:41.275 "state": "configuring", 00:18:41.275 "raid_level": "raid0", 00:18:41.275 "superblock": true, 00:18:41.275 "num_base_bdevs": 4, 00:18:41.275 "num_base_bdevs_discovered": 3, 00:18:41.275 "num_base_bdevs_operational": 4, 00:18:41.275 "base_bdevs_list": [ 00:18:41.275 { 00:18:41.275 "name": "BaseBdev1", 00:18:41.275 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:41.275 "is_configured": false, 00:18:41.275 "data_offset": 0, 00:18:41.275 "data_size": 0 00:18:41.275 }, 00:18:41.275 { 00:18:41.275 "name": "BaseBdev2", 00:18:41.275 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:41.275 "is_configured": true, 00:18:41.275 "data_offset": 2048, 00:18:41.275 "data_size": 63488 00:18:41.275 }, 00:18:41.275 { 00:18:41.275 "name": "BaseBdev3", 00:18:41.275 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:41.275 "is_configured": true, 00:18:41.275 "data_offset": 2048, 00:18:41.275 "data_size": 63488 00:18:41.275 }, 00:18:41.275 { 00:18:41.275 "name": "BaseBdev4", 00:18:41.275 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:41.275 "is_configured": true, 00:18:41.275 "data_offset": 2048, 00:18:41.275 "data_size": 63488 00:18:41.275 } 00:18:41.275 ] 00:18:41.275 }' 00:18:41.275 15:55:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.275 15:55:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:41.844 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:42.104 [2024-07-12 15:55:02.333311] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:42.104 15:55:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:42.104 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:42.104 "name": "Existed_Raid", 00:18:42.105 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:42.105 "strip_size_kb": 64, 00:18:42.105 "state": "configuring", 00:18:42.105 "raid_level": "raid0", 00:18:42.105 "superblock": true, 00:18:42.105 "num_base_bdevs": 4, 00:18:42.105 "num_base_bdevs_discovered": 2, 00:18:42.105 "num_base_bdevs_operational": 4, 00:18:42.105 "base_bdevs_list": [ 00:18:42.105 { 00:18:42.105 "name": "BaseBdev1", 00:18:42.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:42.105 "is_configured": false, 00:18:42.105 "data_offset": 0, 00:18:42.105 "data_size": 0 00:18:42.105 }, 00:18:42.105 { 00:18:42.105 "name": null, 00:18:42.105 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:42.105 "is_configured": false, 00:18:42.105 "data_offset": 2048, 00:18:42.105 "data_size": 63488 00:18:42.105 }, 00:18:42.105 { 00:18:42.105 "name": "BaseBdev3", 00:18:42.105 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:42.105 "is_configured": true, 00:18:42.105 "data_offset": 2048, 00:18:42.105 "data_size": 63488 00:18:42.105 }, 00:18:42.105 { 00:18:42.105 "name": "BaseBdev4", 00:18:42.105 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:42.105 "is_configured": true, 00:18:42.105 "data_offset": 2048, 00:18:42.105 "data_size": 63488 00:18:42.105 } 00:18:42.105 ] 00:18:42.105 }' 00:18:42.105 15:55:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:42.105 15:55:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:42.674 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.674 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:42.934 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:42.934 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 
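For readers following the trace, the base-bdev setup exercised at this point reduces to the rpc.py sequence sketched below; this is only an illustrative condensation of the calls already visible in the trace, assuming the same RPC socket path used by this run (32 MiB at a 512-byte block size gives the 65536 blocks reported in the bdev dumps above):

  RPC='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock'
  # create the malloc base bdev that the configuring raid is still waiting for
  $RPC bdev_malloc_create 32 512 -b BaseBdev1
  # let the raid module examine and claim the new bdev, then confirm it is visible
  $RPC bdev_wait_for_examine
  $RPC bdev_get_bdevs -b BaseBdev1 -t 2000

Once the last missing base bdev is claimed, the raid bdev's state is expected to move from "configuring" toward "online", which is what the verify_raid_bdev_state checks traced below assert.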
00:18:43.195 [2024-07-12 15:55:03.469195] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:43.195 BaseBdev1 00:18:43.195 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:43.195 15:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:43.195 15:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:43.195 15:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:43.195 15:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:43.195 15:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:43.195 15:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:43.455 15:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:43.455 [ 00:18:43.455 { 00:18:43.455 "name": "BaseBdev1", 00:18:43.455 "aliases": [ 00:18:43.455 "f1773972-1936-4285-b4a4-f74e7c370573" 00:18:43.455 ], 00:18:43.455 "product_name": "Malloc disk", 00:18:43.455 "block_size": 512, 00:18:43.455 "num_blocks": 65536, 00:18:43.455 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:43.455 "assigned_rate_limits": { 00:18:43.455 "rw_ios_per_sec": 0, 00:18:43.455 "rw_mbytes_per_sec": 0, 00:18:43.455 "r_mbytes_per_sec": 0, 00:18:43.455 "w_mbytes_per_sec": 0 00:18:43.455 }, 00:18:43.455 "claimed": true, 00:18:43.455 "claim_type": "exclusive_write", 00:18:43.455 "zoned": false, 00:18:43.455 "supported_io_types": { 00:18:43.455 "read": true, 00:18:43.455 "write": true, 00:18:43.455 "unmap": true, 00:18:43.455 "flush": true, 00:18:43.455 "reset": true, 00:18:43.455 "nvme_admin": false, 00:18:43.455 "nvme_io": false, 00:18:43.455 "nvme_io_md": false, 00:18:43.455 "write_zeroes": true, 00:18:43.455 "zcopy": true, 00:18:43.455 "get_zone_info": false, 00:18:43.455 "zone_management": false, 00:18:43.455 "zone_append": false, 00:18:43.455 "compare": false, 00:18:43.455 "compare_and_write": false, 00:18:43.455 "abort": true, 00:18:43.455 "seek_hole": false, 00:18:43.455 "seek_data": false, 00:18:43.455 "copy": true, 00:18:43.455 "nvme_iov_md": false 00:18:43.455 }, 00:18:43.455 "memory_domains": [ 00:18:43.455 { 00:18:43.455 "dma_device_id": "system", 00:18:43.456 "dma_device_type": 1 00:18:43.456 }, 00:18:43.456 { 00:18:43.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.456 "dma_device_type": 2 00:18:43.456 } 00:18:43.456 ], 00:18:43.456 "driver_specific": {} 00:18:43.456 } 00:18:43.456 ] 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:43.456 15:55:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.026 15:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:44.026 "name": "Existed_Raid", 00:18:44.026 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:44.026 "strip_size_kb": 64, 00:18:44.026 "state": "configuring", 00:18:44.026 "raid_level": "raid0", 00:18:44.026 "superblock": true, 00:18:44.026 "num_base_bdevs": 4, 00:18:44.026 "num_base_bdevs_discovered": 3, 00:18:44.026 "num_base_bdevs_operational": 4, 00:18:44.026 "base_bdevs_list": [ 00:18:44.026 { 00:18:44.026 "name": "BaseBdev1", 00:18:44.026 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:44.026 "is_configured": true, 00:18:44.026 "data_offset": 2048, 00:18:44.026 "data_size": 63488 00:18:44.026 }, 00:18:44.026 { 00:18:44.026 "name": null, 00:18:44.026 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:44.026 "is_configured": false, 00:18:44.026 "data_offset": 2048, 00:18:44.026 "data_size": 63488 00:18:44.026 }, 00:18:44.026 { 00:18:44.026 "name": "BaseBdev3", 00:18:44.026 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:44.026 "is_configured": true, 00:18:44.026 "data_offset": 2048, 00:18:44.026 "data_size": 63488 00:18:44.026 }, 00:18:44.026 { 00:18:44.026 "name": "BaseBdev4", 00:18:44.026 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:44.026 "is_configured": true, 00:18:44.026 "data_offset": 2048, 00:18:44.026 "data_size": 63488 00:18:44.026 } 00:18:44.026 ] 00:18:44.026 }' 00:18:44.026 15:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:44.026 15:55:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:44.594 15:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.594 15:55:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:44.854 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:44.854 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:45.114 [2024-07-12 15:55:05.306008] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.114 "name": "Existed_Raid", 00:18:45.114 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:45.114 "strip_size_kb": 64, 00:18:45.114 "state": "configuring", 00:18:45.114 "raid_level": "raid0", 00:18:45.114 "superblock": true, 00:18:45.114 "num_base_bdevs": 4, 00:18:45.114 "num_base_bdevs_discovered": 2, 00:18:45.114 "num_base_bdevs_operational": 4, 00:18:45.114 "base_bdevs_list": [ 00:18:45.114 { 00:18:45.114 "name": "BaseBdev1", 00:18:45.114 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:45.114 "is_configured": true, 00:18:45.114 "data_offset": 2048, 00:18:45.114 "data_size": 63488 00:18:45.114 }, 00:18:45.114 { 00:18:45.114 "name": null, 00:18:45.114 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:45.114 "is_configured": false, 00:18:45.114 "data_offset": 2048, 00:18:45.114 "data_size": 63488 00:18:45.114 }, 00:18:45.114 { 00:18:45.114 "name": null, 00:18:45.114 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:45.114 "is_configured": false, 00:18:45.114 "data_offset": 2048, 00:18:45.114 "data_size": 63488 00:18:45.114 }, 00:18:45.114 { 00:18:45.114 "name": "BaseBdev4", 00:18:45.114 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:45.114 "is_configured": true, 00:18:45.114 "data_offset": 2048, 00:18:45.114 "data_size": 63488 00:18:45.114 } 00:18:45.114 ] 00:18:45.114 }' 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.114 15:55:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:46.053 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.053 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:46.314 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:46.314 
15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:46.314 [2024-07-12 15:55:06.757684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.574 15:55:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:47.144 15:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.144 "name": "Existed_Raid", 00:18:47.144 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:47.144 "strip_size_kb": 64, 00:18:47.144 "state": "configuring", 00:18:47.144 "raid_level": "raid0", 00:18:47.144 "superblock": true, 00:18:47.144 "num_base_bdevs": 4, 00:18:47.144 "num_base_bdevs_discovered": 3, 00:18:47.144 "num_base_bdevs_operational": 4, 00:18:47.144 "base_bdevs_list": [ 00:18:47.144 { 00:18:47.144 "name": "BaseBdev1", 00:18:47.144 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:47.144 "is_configured": true, 00:18:47.144 "data_offset": 2048, 00:18:47.144 "data_size": 63488 00:18:47.144 }, 00:18:47.144 { 00:18:47.144 "name": null, 00:18:47.144 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:47.144 "is_configured": false, 00:18:47.144 "data_offset": 2048, 00:18:47.144 "data_size": 63488 00:18:47.144 }, 00:18:47.144 { 00:18:47.144 "name": "BaseBdev3", 00:18:47.144 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:47.144 "is_configured": true, 00:18:47.144 "data_offset": 2048, 00:18:47.144 "data_size": 63488 00:18:47.144 }, 00:18:47.144 { 00:18:47.144 "name": "BaseBdev4", 00:18:47.144 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:47.144 "is_configured": true, 00:18:47.144 "data_offset": 2048, 00:18:47.144 "data_size": 63488 00:18:47.144 } 00:18:47.144 ] 00:18:47.144 }' 00:18:47.144 15:55:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.144 15:55:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:48.084 15:55:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.084 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:48.084 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:48.084 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:48.654 [2024-07-12 15:55:08.931207] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:48.654 15:55:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:48.913 15:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:48.914 "name": "Existed_Raid", 00:18:48.914 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:48.914 "strip_size_kb": 64, 00:18:48.914 "state": "configuring", 00:18:48.914 "raid_level": "raid0", 00:18:48.914 "superblock": true, 00:18:48.914 "num_base_bdevs": 4, 00:18:48.914 "num_base_bdevs_discovered": 2, 00:18:48.914 "num_base_bdevs_operational": 4, 00:18:48.914 "base_bdevs_list": [ 00:18:48.914 { 00:18:48.914 "name": null, 00:18:48.914 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:48.914 "is_configured": false, 00:18:48.914 "data_offset": 2048, 00:18:48.914 "data_size": 63488 00:18:48.914 }, 00:18:48.914 { 00:18:48.914 "name": null, 00:18:48.914 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:48.914 "is_configured": false, 00:18:48.914 "data_offset": 2048, 00:18:48.914 "data_size": 63488 00:18:48.914 }, 00:18:48.914 { 00:18:48.914 "name": "BaseBdev3", 00:18:48.914 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:48.914 "is_configured": true, 00:18:48.914 "data_offset": 2048, 00:18:48.914 "data_size": 63488 00:18:48.914 }, 00:18:48.914 { 00:18:48.914 "name": "BaseBdev4", 00:18:48.914 "uuid": 
"fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:48.914 "is_configured": true, 00:18:48.914 "data_offset": 2048, 00:18:48.914 "data_size": 63488 00:18:48.914 } 00:18:48.914 ] 00:18:48.914 }' 00:18:48.914 15:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:48.914 15:55:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:49.483 15:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:49.483 15:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:49.483 15:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:49.483 15:55:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:50.052 [2024-07-12 15:55:10.408761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.052 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:50.311 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.311 "name": "Existed_Raid", 00:18:50.311 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:50.311 "strip_size_kb": 64, 00:18:50.311 "state": "configuring", 00:18:50.311 "raid_level": "raid0", 00:18:50.311 "superblock": true, 00:18:50.311 "num_base_bdevs": 4, 00:18:50.312 "num_base_bdevs_discovered": 3, 00:18:50.312 "num_base_bdevs_operational": 4, 00:18:50.312 "base_bdevs_list": [ 00:18:50.312 { 00:18:50.312 "name": null, 00:18:50.312 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:50.312 "is_configured": false, 00:18:50.312 "data_offset": 2048, 00:18:50.312 "data_size": 63488 00:18:50.312 }, 00:18:50.312 { 00:18:50.312 "name": "BaseBdev2", 00:18:50.312 "uuid": 
"f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:50.312 "is_configured": true, 00:18:50.312 "data_offset": 2048, 00:18:50.312 "data_size": 63488 00:18:50.312 }, 00:18:50.312 { 00:18:50.312 "name": "BaseBdev3", 00:18:50.312 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:50.312 "is_configured": true, 00:18:50.312 "data_offset": 2048, 00:18:50.312 "data_size": 63488 00:18:50.312 }, 00:18:50.312 { 00:18:50.312 "name": "BaseBdev4", 00:18:50.312 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:50.312 "is_configured": true, 00:18:50.312 "data_offset": 2048, 00:18:50.312 "data_size": 63488 00:18:50.312 } 00:18:50.312 ] 00:18:50.312 }' 00:18:50.312 15:55:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.312 15:55:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:50.878 15:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.879 15:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:51.138 15:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:51.138 15:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:51.138 15:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.138 15:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f1773972-1936-4285-b4a4-f74e7c370573 00:18:51.708 [2024-07-12 15:55:12.089991] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:51.708 [2024-07-12 15:55:12.090113] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f76c90 00:18:51.708 [2024-07-12 15:55:12.090120] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:51.708 [2024-07-12 15:55:12.090260] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f738f0 00:18:51.708 [2024-07-12 15:55:12.090350] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f76c90 00:18:51.708 [2024-07-12 15:55:12.090356] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f76c90 00:18:51.708 [2024-07-12 15:55:12.090424] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:51.708 NewBaseBdev 00:18:51.708 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:51.708 15:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:51.708 15:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:51.708 15:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:51.708 15:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:51.708 15:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:51.708 15:55:12 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:51.968 15:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:52.537 [ 00:18:52.537 { 00:18:52.537 "name": "NewBaseBdev", 00:18:52.537 "aliases": [ 00:18:52.537 "f1773972-1936-4285-b4a4-f74e7c370573" 00:18:52.537 ], 00:18:52.537 "product_name": "Malloc disk", 00:18:52.537 "block_size": 512, 00:18:52.537 "num_blocks": 65536, 00:18:52.537 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:52.537 "assigned_rate_limits": { 00:18:52.537 "rw_ios_per_sec": 0, 00:18:52.537 "rw_mbytes_per_sec": 0, 00:18:52.537 "r_mbytes_per_sec": 0, 00:18:52.537 "w_mbytes_per_sec": 0 00:18:52.537 }, 00:18:52.537 "claimed": true, 00:18:52.537 "claim_type": "exclusive_write", 00:18:52.538 "zoned": false, 00:18:52.538 "supported_io_types": { 00:18:52.538 "read": true, 00:18:52.538 "write": true, 00:18:52.538 "unmap": true, 00:18:52.538 "flush": true, 00:18:52.538 "reset": true, 00:18:52.538 "nvme_admin": false, 00:18:52.538 "nvme_io": false, 00:18:52.538 "nvme_io_md": false, 00:18:52.538 "write_zeroes": true, 00:18:52.538 "zcopy": true, 00:18:52.538 "get_zone_info": false, 00:18:52.538 "zone_management": false, 00:18:52.538 "zone_append": false, 00:18:52.538 "compare": false, 00:18:52.538 "compare_and_write": false, 00:18:52.538 "abort": true, 00:18:52.538 "seek_hole": false, 00:18:52.538 "seek_data": false, 00:18:52.538 "copy": true, 00:18:52.538 "nvme_iov_md": false 00:18:52.538 }, 00:18:52.538 "memory_domains": [ 00:18:52.538 { 00:18:52.538 "dma_device_id": "system", 00:18:52.538 "dma_device_type": 1 00:18:52.538 }, 00:18:52.538 { 00:18:52.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:52.538 "dma_device_type": 2 00:18:52.538 } 00:18:52.538 ], 00:18:52.538 "driver_specific": {} 00:18:52.538 } 00:18:52.538 ] 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:18:52.538 15:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:52.797 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:52.797 "name": "Existed_Raid", 00:18:52.797 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:52.797 "strip_size_kb": 64, 00:18:52.797 "state": "online", 00:18:52.797 "raid_level": "raid0", 00:18:52.797 "superblock": true, 00:18:52.797 "num_base_bdevs": 4, 00:18:52.797 "num_base_bdevs_discovered": 4, 00:18:52.797 "num_base_bdevs_operational": 4, 00:18:52.797 "base_bdevs_list": [ 00:18:52.797 { 00:18:52.798 "name": "NewBaseBdev", 00:18:52.798 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:52.798 "is_configured": true, 00:18:52.798 "data_offset": 2048, 00:18:52.798 "data_size": 63488 00:18:52.798 }, 00:18:52.798 { 00:18:52.798 "name": "BaseBdev2", 00:18:52.798 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:52.798 "is_configured": true, 00:18:52.798 "data_offset": 2048, 00:18:52.798 "data_size": 63488 00:18:52.798 }, 00:18:52.798 { 00:18:52.798 "name": "BaseBdev3", 00:18:52.798 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:52.798 "is_configured": true, 00:18:52.798 "data_offset": 2048, 00:18:52.798 "data_size": 63488 00:18:52.798 }, 00:18:52.798 { 00:18:52.798 "name": "BaseBdev4", 00:18:52.798 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:52.798 "is_configured": true, 00:18:52.798 "data_offset": 2048, 00:18:52.798 "data_size": 63488 00:18:52.798 } 00:18:52.798 ] 00:18:52.798 }' 00:18:52.798 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:52.798 15:55:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:53.368 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:53.368 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:53.368 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:53.368 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:53.368 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:53.368 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:53.368 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:53.368 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:53.368 [2024-07-12 15:55:13.750448] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:53.368 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:53.368 "name": "Existed_Raid", 00:18:53.368 "aliases": [ 00:18:53.368 "cab15fe6-c657-4787-a035-4b790a5b838c" 00:18:53.368 ], 00:18:53.368 "product_name": "Raid Volume", 00:18:53.368 "block_size": 512, 00:18:53.368 "num_blocks": 253952, 00:18:53.368 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:53.368 "assigned_rate_limits": { 00:18:53.368 "rw_ios_per_sec": 0, 00:18:53.368 "rw_mbytes_per_sec": 0, 00:18:53.368 "r_mbytes_per_sec": 0, 00:18:53.368 "w_mbytes_per_sec": 0 00:18:53.368 }, 00:18:53.368 
"claimed": false, 00:18:53.368 "zoned": false, 00:18:53.368 "supported_io_types": { 00:18:53.368 "read": true, 00:18:53.368 "write": true, 00:18:53.368 "unmap": true, 00:18:53.368 "flush": true, 00:18:53.368 "reset": true, 00:18:53.368 "nvme_admin": false, 00:18:53.368 "nvme_io": false, 00:18:53.368 "nvme_io_md": false, 00:18:53.368 "write_zeroes": true, 00:18:53.368 "zcopy": false, 00:18:53.368 "get_zone_info": false, 00:18:53.368 "zone_management": false, 00:18:53.368 "zone_append": false, 00:18:53.368 "compare": false, 00:18:53.368 "compare_and_write": false, 00:18:53.368 "abort": false, 00:18:53.368 "seek_hole": false, 00:18:53.368 "seek_data": false, 00:18:53.368 "copy": false, 00:18:53.368 "nvme_iov_md": false 00:18:53.368 }, 00:18:53.368 "memory_domains": [ 00:18:53.368 { 00:18:53.368 "dma_device_id": "system", 00:18:53.368 "dma_device_type": 1 00:18:53.368 }, 00:18:53.368 { 00:18:53.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.368 "dma_device_type": 2 00:18:53.368 }, 00:18:53.368 { 00:18:53.368 "dma_device_id": "system", 00:18:53.368 "dma_device_type": 1 00:18:53.368 }, 00:18:53.368 { 00:18:53.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.368 "dma_device_type": 2 00:18:53.368 }, 00:18:53.368 { 00:18:53.368 "dma_device_id": "system", 00:18:53.368 "dma_device_type": 1 00:18:53.368 }, 00:18:53.368 { 00:18:53.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.368 "dma_device_type": 2 00:18:53.368 }, 00:18:53.368 { 00:18:53.368 "dma_device_id": "system", 00:18:53.368 "dma_device_type": 1 00:18:53.368 }, 00:18:53.368 { 00:18:53.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.368 "dma_device_type": 2 00:18:53.368 } 00:18:53.368 ], 00:18:53.368 "driver_specific": { 00:18:53.368 "raid": { 00:18:53.368 "uuid": "cab15fe6-c657-4787-a035-4b790a5b838c", 00:18:53.368 "strip_size_kb": 64, 00:18:53.368 "state": "online", 00:18:53.368 "raid_level": "raid0", 00:18:53.368 "superblock": true, 00:18:53.368 "num_base_bdevs": 4, 00:18:53.368 "num_base_bdevs_discovered": 4, 00:18:53.368 "num_base_bdevs_operational": 4, 00:18:53.368 "base_bdevs_list": [ 00:18:53.368 { 00:18:53.369 "name": "NewBaseBdev", 00:18:53.369 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:53.369 "is_configured": true, 00:18:53.369 "data_offset": 2048, 00:18:53.369 "data_size": 63488 00:18:53.369 }, 00:18:53.369 { 00:18:53.369 "name": "BaseBdev2", 00:18:53.369 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:53.369 "is_configured": true, 00:18:53.369 "data_offset": 2048, 00:18:53.369 "data_size": 63488 00:18:53.369 }, 00:18:53.369 { 00:18:53.369 "name": "BaseBdev3", 00:18:53.369 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:53.369 "is_configured": true, 00:18:53.369 "data_offset": 2048, 00:18:53.369 "data_size": 63488 00:18:53.369 }, 00:18:53.369 { 00:18:53.369 "name": "BaseBdev4", 00:18:53.369 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:53.369 "is_configured": true, 00:18:53.369 "data_offset": 2048, 00:18:53.369 "data_size": 63488 00:18:53.369 } 00:18:53.369 ] 00:18:53.369 } 00:18:53.369 } 00:18:53.369 }' 00:18:53.369 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:53.369 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:53.369 BaseBdev2 00:18:53.369 BaseBdev3 00:18:53.369 BaseBdev4' 00:18:53.369 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in 
$base_bdev_names 00:18:53.628 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:53.628 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:53.628 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:53.628 "name": "NewBaseBdev", 00:18:53.628 "aliases": [ 00:18:53.628 "f1773972-1936-4285-b4a4-f74e7c370573" 00:18:53.628 ], 00:18:53.628 "product_name": "Malloc disk", 00:18:53.628 "block_size": 512, 00:18:53.628 "num_blocks": 65536, 00:18:53.628 "uuid": "f1773972-1936-4285-b4a4-f74e7c370573", 00:18:53.628 "assigned_rate_limits": { 00:18:53.628 "rw_ios_per_sec": 0, 00:18:53.628 "rw_mbytes_per_sec": 0, 00:18:53.628 "r_mbytes_per_sec": 0, 00:18:53.628 "w_mbytes_per_sec": 0 00:18:53.628 }, 00:18:53.628 "claimed": true, 00:18:53.628 "claim_type": "exclusive_write", 00:18:53.628 "zoned": false, 00:18:53.628 "supported_io_types": { 00:18:53.628 "read": true, 00:18:53.628 "write": true, 00:18:53.628 "unmap": true, 00:18:53.628 "flush": true, 00:18:53.628 "reset": true, 00:18:53.628 "nvme_admin": false, 00:18:53.628 "nvme_io": false, 00:18:53.628 "nvme_io_md": false, 00:18:53.628 "write_zeroes": true, 00:18:53.628 "zcopy": true, 00:18:53.628 "get_zone_info": false, 00:18:53.628 "zone_management": false, 00:18:53.629 "zone_append": false, 00:18:53.629 "compare": false, 00:18:53.629 "compare_and_write": false, 00:18:53.629 "abort": true, 00:18:53.629 "seek_hole": false, 00:18:53.629 "seek_data": false, 00:18:53.629 "copy": true, 00:18:53.629 "nvme_iov_md": false 00:18:53.629 }, 00:18:53.629 "memory_domains": [ 00:18:53.629 { 00:18:53.629 "dma_device_id": "system", 00:18:53.629 "dma_device_type": 1 00:18:53.629 }, 00:18:53.629 { 00:18:53.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.629 "dma_device_type": 2 00:18:53.629 } 00:18:53.629 ], 00:18:53.629 "driver_specific": {} 00:18:53.629 }' 00:18:53.629 15:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.629 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:53.629 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:53.629 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:53.888 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.148 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.148 "name": "BaseBdev2", 00:18:54.148 "aliases": [ 00:18:54.148 "f2a6c488-f4de-430b-a22e-697581e0ad6a" 00:18:54.148 ], 00:18:54.148 "product_name": "Malloc disk", 00:18:54.148 "block_size": 512, 00:18:54.148 "num_blocks": 65536, 00:18:54.148 "uuid": "f2a6c488-f4de-430b-a22e-697581e0ad6a", 00:18:54.148 "assigned_rate_limits": { 00:18:54.148 "rw_ios_per_sec": 0, 00:18:54.148 "rw_mbytes_per_sec": 0, 00:18:54.148 "r_mbytes_per_sec": 0, 00:18:54.148 "w_mbytes_per_sec": 0 00:18:54.148 }, 00:18:54.148 "claimed": true, 00:18:54.148 "claim_type": "exclusive_write", 00:18:54.148 "zoned": false, 00:18:54.148 "supported_io_types": { 00:18:54.148 "read": true, 00:18:54.148 "write": true, 00:18:54.148 "unmap": true, 00:18:54.148 "flush": true, 00:18:54.148 "reset": true, 00:18:54.148 "nvme_admin": false, 00:18:54.148 "nvme_io": false, 00:18:54.148 "nvme_io_md": false, 00:18:54.148 "write_zeroes": true, 00:18:54.148 "zcopy": true, 00:18:54.148 "get_zone_info": false, 00:18:54.148 "zone_management": false, 00:18:54.148 "zone_append": false, 00:18:54.148 "compare": false, 00:18:54.148 "compare_and_write": false, 00:18:54.148 "abort": true, 00:18:54.148 "seek_hole": false, 00:18:54.148 "seek_data": false, 00:18:54.148 "copy": true, 00:18:54.148 "nvme_iov_md": false 00:18:54.148 }, 00:18:54.148 "memory_domains": [ 00:18:54.148 { 00:18:54.148 "dma_device_id": "system", 00:18:54.148 "dma_device_type": 1 00:18:54.148 }, 00:18:54.148 { 00:18:54.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.148 "dma_device_type": 2 00:18:54.148 } 00:18:54.148 ], 00:18:54.148 "driver_specific": {} 00:18:54.148 }' 00:18:54.148 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.148 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.408 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.409 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.409 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.409 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.409 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.409 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.409 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.409 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.409 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:54.669 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:54.669 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:54.669 15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:54.669 
15:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:54.669 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:54.669 "name": "BaseBdev3", 00:18:54.669 "aliases": [ 00:18:54.669 "c3270ebf-07d2-48d8-b878-fb7f5f4a304f" 00:18:54.669 ], 00:18:54.669 "product_name": "Malloc disk", 00:18:54.669 "block_size": 512, 00:18:54.669 "num_blocks": 65536, 00:18:54.669 "uuid": "c3270ebf-07d2-48d8-b878-fb7f5f4a304f", 00:18:54.669 "assigned_rate_limits": { 00:18:54.669 "rw_ios_per_sec": 0, 00:18:54.669 "rw_mbytes_per_sec": 0, 00:18:54.669 "r_mbytes_per_sec": 0, 00:18:54.669 "w_mbytes_per_sec": 0 00:18:54.669 }, 00:18:54.669 "claimed": true, 00:18:54.669 "claim_type": "exclusive_write", 00:18:54.669 "zoned": false, 00:18:54.669 "supported_io_types": { 00:18:54.669 "read": true, 00:18:54.669 "write": true, 00:18:54.669 "unmap": true, 00:18:54.669 "flush": true, 00:18:54.669 "reset": true, 00:18:54.669 "nvme_admin": false, 00:18:54.669 "nvme_io": false, 00:18:54.669 "nvme_io_md": false, 00:18:54.669 "write_zeroes": true, 00:18:54.669 "zcopy": true, 00:18:54.669 "get_zone_info": false, 00:18:54.669 "zone_management": false, 00:18:54.669 "zone_append": false, 00:18:54.669 "compare": false, 00:18:54.669 "compare_and_write": false, 00:18:54.669 "abort": true, 00:18:54.669 "seek_hole": false, 00:18:54.669 "seek_data": false, 00:18:54.669 "copy": true, 00:18:54.669 "nvme_iov_md": false 00:18:54.669 }, 00:18:54.669 "memory_domains": [ 00:18:54.669 { 00:18:54.669 "dma_device_id": "system", 00:18:54.669 "dma_device_type": 1 00:18:54.669 }, 00:18:54.669 { 00:18:54.669 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:54.669 "dma_device_type": 2 00:18:54.669 } 00:18:54.669 ], 00:18:54.669 "driver_specific": {} 00:18:54.669 }' 00:18:54.669 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.669 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:54.929 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:54.929 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.929 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:54.929 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:54.929 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.929 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:54.929 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:54.929 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.190 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.190 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.190 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:55.190 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:55.190 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:55.190 15:55:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:55.190 "name": "BaseBdev4", 00:18:55.190 "aliases": [ 00:18:55.190 "fbf7ac42-4a08-4f79-b879-47d983786f25" 00:18:55.190 ], 00:18:55.190 "product_name": "Malloc disk", 00:18:55.190 "block_size": 512, 00:18:55.190 "num_blocks": 65536, 00:18:55.190 "uuid": "fbf7ac42-4a08-4f79-b879-47d983786f25", 00:18:55.190 "assigned_rate_limits": { 00:18:55.190 "rw_ios_per_sec": 0, 00:18:55.190 "rw_mbytes_per_sec": 0, 00:18:55.190 "r_mbytes_per_sec": 0, 00:18:55.190 "w_mbytes_per_sec": 0 00:18:55.190 }, 00:18:55.190 "claimed": true, 00:18:55.190 "claim_type": "exclusive_write", 00:18:55.190 "zoned": false, 00:18:55.190 "supported_io_types": { 00:18:55.190 "read": true, 00:18:55.190 "write": true, 00:18:55.190 "unmap": true, 00:18:55.190 "flush": true, 00:18:55.190 "reset": true, 00:18:55.190 "nvme_admin": false, 00:18:55.190 "nvme_io": false, 00:18:55.190 "nvme_io_md": false, 00:18:55.190 "write_zeroes": true, 00:18:55.190 "zcopy": true, 00:18:55.190 "get_zone_info": false, 00:18:55.190 "zone_management": false, 00:18:55.190 "zone_append": false, 00:18:55.190 "compare": false, 00:18:55.190 "compare_and_write": false, 00:18:55.190 "abort": true, 00:18:55.190 "seek_hole": false, 00:18:55.190 "seek_data": false, 00:18:55.190 "copy": true, 00:18:55.190 "nvme_iov_md": false 00:18:55.190 }, 00:18:55.190 "memory_domains": [ 00:18:55.190 { 00:18:55.190 "dma_device_id": "system", 00:18:55.190 "dma_device_type": 1 00:18:55.190 }, 00:18:55.190 { 00:18:55.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:55.190 "dma_device_type": 2 00:18:55.190 } 00:18:55.190 ], 00:18:55.190 "driver_specific": {} 00:18:55.190 }' 00:18:55.190 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.450 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:55.450 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:55.450 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.450 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:55.450 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:55.450 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.450 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:55.450 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:55.450 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.711 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:55.711 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:55.711 15:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:55.711 [2024-07-12 15:55:16.124211] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:55.711 [2024-07-12 15:55:16.124232] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:55.711 [2024-07-12 15:55:16.124272] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
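The run above exercises verify_raid_bdev_properties: the Existed_Raid volume is dumped with bdev_get_bdevs, the configured base bdev names are pulled out with jq, and each base bdev's block_size, md_size, md_interleave and dif_type are checked in turn. A minimal shell sketch of that loop, using only the RPC socket, paths and jq filters visible in the trace — the rpc wrapper function and the assumption that Existed_Raid is still assembled are illustrative additions, not part of the captured script:

    # Sketch only: replays the per-base-bdev property checks seen in the trace above.
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }  # convenience wrapper (assumed)
    raid_bdev_info=$(rpc bdev_get_bdevs -b Existed_Raid | jq '.[]')
    base_bdev_names=$(jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' <<< "$raid_bdev_info")
    for name in $base_bdev_names; do
        base_bdev_info=$(rpc bdev_get_bdevs -b "$name" | jq '.[]')
        [[ $(jq .block_size <<< "$base_bdev_info") == 512 ]]      # malloc base bdevs were created with 512-byte blocks
        [[ $(jq .md_size <<< "$base_bdev_info") == null ]]        # no separate metadata region
        [[ $(jq .md_interleave <<< "$base_bdev_info") == null ]]
        [[ $(jq .dif_type <<< "$base_bdev_info") == null ]]
    done

In the trace the same four checks run for NewBaseBdev, BaseBdev2, BaseBdev3 and BaseBdev4 before bdev_raid_delete tears the volume down.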
00:18:55.711 [2024-07-12 15:55:16.124323] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:55.711 [2024-07-12 15:55:16.124330] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f76c90 name Existed_Raid, state offline 00:18:55.711 15:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2573352 00:18:55.711 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2573352 ']' 00:18:55.711 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2573352 00:18:55.711 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:55.711 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:55.711 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2573352 00:18:55.970 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:55.970 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:55.970 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2573352' 00:18:55.970 killing process with pid 2573352 00:18:55.970 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2573352 00:18:55.970 [2024-07-12 15:55:16.189727] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:55.970 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2573352 00:18:55.970 [2024-07-12 15:55:16.210133] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:55.970 15:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:55.970 00:18:55.970 real 0m30.687s 00:18:55.970 user 0m57.748s 00:18:55.970 sys 0m4.191s 00:18:55.970 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:55.970 15:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:55.970 ************************************ 00:18:55.970 END TEST raid_state_function_test_sb 00:18:55.970 ************************************ 00:18:55.970 15:55:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:55.970 15:55:16 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:18:55.970 15:55:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:55.970 15:55:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:55.970 15:55:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:55.970 ************************************ 00:18:55.970 START TEST raid_superblock_test 00:18:55.970 ************************************ 00:18:55.970 15:55:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:18:55.970 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:18:55.970 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:18:55.970 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:55.970 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local 
base_bdevs_malloc 00:18:55.970 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:55.970 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2578951 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2578951 /var/tmp/spdk-raid.sock 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2578951 ']' 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:56.296 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:56.296 15:55:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.296 [2024-07-12 15:55:16.471651] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
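raid_superblock_test drives its RAID volumes through a dedicated bdev_svc instance rather than the main application: the app is started with its own UNIX-domain RPC socket and the bdev_raid debug log flag, and every rpc.py call in the test is pointed at that socket. A minimal sketch of that bring-up, assuming the usual backgrounding with & and $! — the trace only shows the resulting pid 2578951 and the waitforlisten call from autotest_common.sh:

    # Sketch only: app/RPC bring-up as reflected in the trace above.
    /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc \
        -r /var/tmp/spdk-raid.sock -L bdev_raid &          # dedicated app instance for the raid bdevs
    raid_pid=$!
    waitforlisten "$raid_pid" /var/tmp/spdk-raid.sock      # helper from autotest_common.sh
    # Subsequent RPCs target the same socket, e.g. the first base bdev created below:
    /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1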
00:18:56.296 [2024-07-12 15:55:16.471697] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2578951 ] 00:18:56.296 [2024-07-12 15:55:16.557676] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:56.296 [2024-07-12 15:55:16.627748] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:56.296 [2024-07-12 15:55:16.677752] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:56.296 [2024-07-12 15:55:16.677778] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:56.867 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:57.437 malloc1 00:18:57.437 15:55:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:57.697 [2024-07-12 15:55:18.025859] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:57.697 [2024-07-12 15:55:18.025893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:57.697 [2024-07-12 15:55:18.025907] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1703b50 00:18:57.697 [2024-07-12 15:55:18.025914] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:57.697 [2024-07-12 15:55:18.027204] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:57.697 [2024-07-12 15:55:18.027224] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:57.697 pt1 00:18:57.697 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:57.697 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:57.697 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:57.697 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:57.697 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:57.697 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:57.697 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:57.697 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:57.697 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:58.266 malloc2 00:18:58.266 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:58.525 [2024-07-12 15:55:18.761657] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:58.525 [2024-07-12 15:55:18.761688] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:58.525 [2024-07-12 15:55:18.761698] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1704df0 00:18:58.525 [2024-07-12 15:55:18.761704] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:58.525 [2024-07-12 15:55:18.762907] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:58.525 [2024-07-12 15:55:18.762926] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:58.525 pt2 00:18:58.525 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:58.525 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:58.525 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:58.525 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:58.525 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:58.525 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:58.525 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:58.525 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:58.525 15:55:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:59.095 malloc3 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:59.095 [2024-07-12 15:55:19.497396] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:59.095 [2024-07-12 15:55:19.497425] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:59.095 [2024-07-12 15:55:19.497435] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1704770 00:18:59.095 [2024-07-12 15:55:19.497441] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.095 [2024-07-12 15:55:19.498638] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:59.095 [2024-07-12 15:55:19.498657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:59.095 pt3 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:59.095 15:55:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:59.665 malloc4 00:18:59.665 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:59.924 [2024-07-12 15:55:20.233648] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:59.924 [2024-07-12 15:55:20.233684] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:59.924 [2024-07-12 15:55:20.233697] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fb840 00:18:59.924 [2024-07-12 15:55:20.233703] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.924 [2024-07-12 15:55:20.234923] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:59.924 [2024-07-12 15:55:20.234942] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:59.924 pt4 00:18:59.924 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:59.924 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:59.924 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:00.492 [2024-07-12 15:55:20.758982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:00.492 [2024-07-12 15:55:20.760014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:00.492 [2024-07-12 15:55:20.760058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:00.492 [2024-07-12 15:55:20.760094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:00.492 [2024-07-12 15:55:20.760229] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18b54c0 00:19:00.492 [2024-07-12 15:55:20.760240] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:00.492 [2024-07-12 15:55:20.760392] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1704570 00:19:00.492 [2024-07-12 15:55:20.760507] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18b54c0 00:19:00.492 [2024-07-12 15:55:20.760513] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18b54c0 00:19:00.492 [2024-07-12 15:55:20.760588] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.492 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:00.753 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:00.753 "name": "raid_bdev1", 00:19:00.753 "uuid": "e00c030b-1921-4dd5-8da6-effc40918467", 00:19:00.753 "strip_size_kb": 64, 00:19:00.753 "state": "online", 00:19:00.753 "raid_level": "raid0", 00:19:00.753 "superblock": true, 00:19:00.753 "num_base_bdevs": 4, 00:19:00.753 "num_base_bdevs_discovered": 4, 00:19:00.753 "num_base_bdevs_operational": 4, 00:19:00.753 "base_bdevs_list": [ 00:19:00.753 { 00:19:00.753 "name": "pt1", 00:19:00.753 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:00.753 "is_configured": true, 00:19:00.753 "data_offset": 2048, 00:19:00.753 "data_size": 63488 00:19:00.753 }, 00:19:00.753 { 00:19:00.753 "name": "pt2", 00:19:00.753 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:00.753 "is_configured": true, 00:19:00.753 "data_offset": 2048, 00:19:00.753 "data_size": 63488 00:19:00.753 }, 00:19:00.753 { 00:19:00.753 "name": "pt3", 00:19:00.753 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:00.753 "is_configured": true, 00:19:00.753 "data_offset": 2048, 00:19:00.753 "data_size": 63488 00:19:00.753 }, 00:19:00.753 { 00:19:00.753 "name": "pt4", 00:19:00.753 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:00.753 "is_configured": true, 00:19:00.753 "data_offset": 2048, 00:19:00.753 "data_size": 63488 00:19:00.753 } 00:19:00.753 ] 00:19:00.753 }' 00:19:00.753 15:55:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:00.753 15:55:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:01.323 [2024-07-12 15:55:21.685545] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:01.323 "name": "raid_bdev1", 00:19:01.323 "aliases": [ 00:19:01.323 "e00c030b-1921-4dd5-8da6-effc40918467" 00:19:01.323 ], 00:19:01.323 "product_name": "Raid Volume", 00:19:01.323 "block_size": 512, 00:19:01.323 "num_blocks": 253952, 00:19:01.323 "uuid": "e00c030b-1921-4dd5-8da6-effc40918467", 00:19:01.323 "assigned_rate_limits": { 00:19:01.323 "rw_ios_per_sec": 0, 00:19:01.323 "rw_mbytes_per_sec": 0, 00:19:01.323 "r_mbytes_per_sec": 0, 00:19:01.323 "w_mbytes_per_sec": 0 00:19:01.323 }, 00:19:01.323 "claimed": false, 00:19:01.323 "zoned": false, 00:19:01.323 "supported_io_types": { 00:19:01.323 "read": true, 00:19:01.323 "write": true, 00:19:01.323 "unmap": true, 00:19:01.323 "flush": true, 00:19:01.323 "reset": true, 00:19:01.323 "nvme_admin": false, 00:19:01.323 "nvme_io": false, 00:19:01.323 "nvme_io_md": false, 00:19:01.323 "write_zeroes": true, 00:19:01.323 "zcopy": false, 00:19:01.323 "get_zone_info": false, 00:19:01.323 "zone_management": false, 00:19:01.323 "zone_append": false, 00:19:01.323 "compare": false, 00:19:01.323 "compare_and_write": false, 00:19:01.323 "abort": false, 00:19:01.323 "seek_hole": false, 00:19:01.323 "seek_data": false, 00:19:01.323 "copy": false, 00:19:01.323 "nvme_iov_md": false 00:19:01.323 }, 00:19:01.323 "memory_domains": [ 00:19:01.323 { 00:19:01.323 "dma_device_id": "system", 00:19:01.323 "dma_device_type": 1 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.323 "dma_device_type": 2 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "dma_device_id": "system", 00:19:01.323 "dma_device_type": 1 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.323 "dma_device_type": 2 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "dma_device_id": "system", 00:19:01.323 "dma_device_type": 1 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.323 "dma_device_type": 2 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "dma_device_id": "system", 00:19:01.323 "dma_device_type": 1 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.323 "dma_device_type": 2 00:19:01.323 } 00:19:01.323 ], 00:19:01.323 "driver_specific": { 00:19:01.323 "raid": { 00:19:01.323 "uuid": "e00c030b-1921-4dd5-8da6-effc40918467", 00:19:01.323 "strip_size_kb": 64, 00:19:01.323 "state": "online", 00:19:01.323 "raid_level": "raid0", 00:19:01.323 "superblock": 
true, 00:19:01.323 "num_base_bdevs": 4, 00:19:01.323 "num_base_bdevs_discovered": 4, 00:19:01.323 "num_base_bdevs_operational": 4, 00:19:01.323 "base_bdevs_list": [ 00:19:01.323 { 00:19:01.323 "name": "pt1", 00:19:01.323 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:01.323 "is_configured": true, 00:19:01.323 "data_offset": 2048, 00:19:01.323 "data_size": 63488 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "name": "pt2", 00:19:01.323 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:01.323 "is_configured": true, 00:19:01.323 "data_offset": 2048, 00:19:01.323 "data_size": 63488 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "name": "pt3", 00:19:01.323 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:01.323 "is_configured": true, 00:19:01.323 "data_offset": 2048, 00:19:01.323 "data_size": 63488 00:19:01.323 }, 00:19:01.323 { 00:19:01.323 "name": "pt4", 00:19:01.323 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:01.323 "is_configured": true, 00:19:01.323 "data_offset": 2048, 00:19:01.323 "data_size": 63488 00:19:01.323 } 00:19:01.323 ] 00:19:01.323 } 00:19:01.323 } 00:19:01.323 }' 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:01.323 pt2 00:19:01.323 pt3 00:19:01.323 pt4' 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:01.323 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:01.582 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:01.582 "name": "pt1", 00:19:01.582 "aliases": [ 00:19:01.582 "00000000-0000-0000-0000-000000000001" 00:19:01.582 ], 00:19:01.582 "product_name": "passthru", 00:19:01.582 "block_size": 512, 00:19:01.582 "num_blocks": 65536, 00:19:01.582 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:01.582 "assigned_rate_limits": { 00:19:01.582 "rw_ios_per_sec": 0, 00:19:01.582 "rw_mbytes_per_sec": 0, 00:19:01.583 "r_mbytes_per_sec": 0, 00:19:01.583 "w_mbytes_per_sec": 0 00:19:01.583 }, 00:19:01.583 "claimed": true, 00:19:01.583 "claim_type": "exclusive_write", 00:19:01.583 "zoned": false, 00:19:01.583 "supported_io_types": { 00:19:01.583 "read": true, 00:19:01.583 "write": true, 00:19:01.583 "unmap": true, 00:19:01.583 "flush": true, 00:19:01.583 "reset": true, 00:19:01.583 "nvme_admin": false, 00:19:01.583 "nvme_io": false, 00:19:01.583 "nvme_io_md": false, 00:19:01.583 "write_zeroes": true, 00:19:01.583 "zcopy": true, 00:19:01.583 "get_zone_info": false, 00:19:01.583 "zone_management": false, 00:19:01.583 "zone_append": false, 00:19:01.583 "compare": false, 00:19:01.583 "compare_and_write": false, 00:19:01.583 "abort": true, 00:19:01.583 "seek_hole": false, 00:19:01.583 "seek_data": false, 00:19:01.583 "copy": true, 00:19:01.583 "nvme_iov_md": false 00:19:01.583 }, 00:19:01.583 "memory_domains": [ 00:19:01.583 { 00:19:01.583 "dma_device_id": "system", 00:19:01.583 "dma_device_type": 1 00:19:01.583 }, 00:19:01.583 { 00:19:01.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:01.583 "dma_device_type": 2 00:19:01.583 } 00:19:01.583 ], 00:19:01.583 "driver_specific": { 00:19:01.583 "passthru": 
{ 00:19:01.583 "name": "pt1", 00:19:01.583 "base_bdev_name": "malloc1" 00:19:01.583 } 00:19:01.583 } 00:19:01.583 }' 00:19:01.583 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.583 15:55:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:01.583 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:01.583 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:01.842 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:02.102 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:02.102 "name": "pt2", 00:19:02.102 "aliases": [ 00:19:02.102 "00000000-0000-0000-0000-000000000002" 00:19:02.102 ], 00:19:02.102 "product_name": "passthru", 00:19:02.102 "block_size": 512, 00:19:02.102 "num_blocks": 65536, 00:19:02.102 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:02.102 "assigned_rate_limits": { 00:19:02.102 "rw_ios_per_sec": 0, 00:19:02.102 "rw_mbytes_per_sec": 0, 00:19:02.102 "r_mbytes_per_sec": 0, 00:19:02.102 "w_mbytes_per_sec": 0 00:19:02.102 }, 00:19:02.102 "claimed": true, 00:19:02.102 "claim_type": "exclusive_write", 00:19:02.102 "zoned": false, 00:19:02.102 "supported_io_types": { 00:19:02.102 "read": true, 00:19:02.102 "write": true, 00:19:02.102 "unmap": true, 00:19:02.102 "flush": true, 00:19:02.102 "reset": true, 00:19:02.102 "nvme_admin": false, 00:19:02.102 "nvme_io": false, 00:19:02.102 "nvme_io_md": false, 00:19:02.102 "write_zeroes": true, 00:19:02.102 "zcopy": true, 00:19:02.102 "get_zone_info": false, 00:19:02.102 "zone_management": false, 00:19:02.102 "zone_append": false, 00:19:02.102 "compare": false, 00:19:02.102 "compare_and_write": false, 00:19:02.102 "abort": true, 00:19:02.102 "seek_hole": false, 00:19:02.102 "seek_data": false, 00:19:02.102 "copy": true, 00:19:02.102 "nvme_iov_md": false 00:19:02.102 }, 00:19:02.102 "memory_domains": [ 00:19:02.102 { 00:19:02.102 "dma_device_id": "system", 00:19:02.102 "dma_device_type": 1 00:19:02.102 }, 00:19:02.102 { 00:19:02.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.102 "dma_device_type": 2 00:19:02.102 } 00:19:02.102 ], 00:19:02.102 "driver_specific": { 00:19:02.102 "passthru": { 00:19:02.102 "name": "pt2", 00:19:02.102 "base_bdev_name": "malloc2" 00:19:02.102 } 00:19:02.102 } 00:19:02.102 }' 00:19:02.102 
15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.102 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.362 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:02.362 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.362 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.362 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:02.362 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.362 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.362 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:02.362 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.362 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.622 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:02.622 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:02.622 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:02.622 15:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:02.622 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:02.622 "name": "pt3", 00:19:02.622 "aliases": [ 00:19:02.622 "00000000-0000-0000-0000-000000000003" 00:19:02.622 ], 00:19:02.622 "product_name": "passthru", 00:19:02.622 "block_size": 512, 00:19:02.622 "num_blocks": 65536, 00:19:02.622 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:02.622 "assigned_rate_limits": { 00:19:02.622 "rw_ios_per_sec": 0, 00:19:02.622 "rw_mbytes_per_sec": 0, 00:19:02.622 "r_mbytes_per_sec": 0, 00:19:02.622 "w_mbytes_per_sec": 0 00:19:02.622 }, 00:19:02.622 "claimed": true, 00:19:02.622 "claim_type": "exclusive_write", 00:19:02.622 "zoned": false, 00:19:02.622 "supported_io_types": { 00:19:02.622 "read": true, 00:19:02.622 "write": true, 00:19:02.622 "unmap": true, 00:19:02.622 "flush": true, 00:19:02.622 "reset": true, 00:19:02.622 "nvme_admin": false, 00:19:02.622 "nvme_io": false, 00:19:02.622 "nvme_io_md": false, 00:19:02.622 "write_zeroes": true, 00:19:02.622 "zcopy": true, 00:19:02.622 "get_zone_info": false, 00:19:02.622 "zone_management": false, 00:19:02.622 "zone_append": false, 00:19:02.622 "compare": false, 00:19:02.622 "compare_and_write": false, 00:19:02.622 "abort": true, 00:19:02.622 "seek_hole": false, 00:19:02.622 "seek_data": false, 00:19:02.622 "copy": true, 00:19:02.622 "nvme_iov_md": false 00:19:02.622 }, 00:19:02.622 "memory_domains": [ 00:19:02.622 { 00:19:02.622 "dma_device_id": "system", 00:19:02.622 "dma_device_type": 1 00:19:02.622 }, 00:19:02.622 { 00:19:02.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.622 "dma_device_type": 2 00:19:02.622 } 00:19:02.622 ], 00:19:02.622 "driver_specific": { 00:19:02.622 "passthru": { 00:19:02.622 "name": "pt3", 00:19:02.622 "base_bdev_name": "malloc3" 00:19:02.622 } 00:19:02.622 } 00:19:02.622 }' 00:19:02.622 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.622 15:55:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.882 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:02.882 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.882 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.882 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:02.882 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.882 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.882 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:02.882 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.882 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.142 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:03.142 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:03.142 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:03.142 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:03.142 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:03.142 "name": "pt4", 00:19:03.142 "aliases": [ 00:19:03.142 "00000000-0000-0000-0000-000000000004" 00:19:03.142 ], 00:19:03.142 "product_name": "passthru", 00:19:03.142 "block_size": 512, 00:19:03.142 "num_blocks": 65536, 00:19:03.142 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:03.142 "assigned_rate_limits": { 00:19:03.142 "rw_ios_per_sec": 0, 00:19:03.142 "rw_mbytes_per_sec": 0, 00:19:03.142 "r_mbytes_per_sec": 0, 00:19:03.142 "w_mbytes_per_sec": 0 00:19:03.142 }, 00:19:03.142 "claimed": true, 00:19:03.142 "claim_type": "exclusive_write", 00:19:03.142 "zoned": false, 00:19:03.142 "supported_io_types": { 00:19:03.142 "read": true, 00:19:03.142 "write": true, 00:19:03.142 "unmap": true, 00:19:03.142 "flush": true, 00:19:03.142 "reset": true, 00:19:03.142 "nvme_admin": false, 00:19:03.142 "nvme_io": false, 00:19:03.142 "nvme_io_md": false, 00:19:03.142 "write_zeroes": true, 00:19:03.142 "zcopy": true, 00:19:03.142 "get_zone_info": false, 00:19:03.142 "zone_management": false, 00:19:03.142 "zone_append": false, 00:19:03.142 "compare": false, 00:19:03.142 "compare_and_write": false, 00:19:03.142 "abort": true, 00:19:03.142 "seek_hole": false, 00:19:03.142 "seek_data": false, 00:19:03.142 "copy": true, 00:19:03.142 "nvme_iov_md": false 00:19:03.142 }, 00:19:03.142 "memory_domains": [ 00:19:03.142 { 00:19:03.142 "dma_device_id": "system", 00:19:03.142 "dma_device_type": 1 00:19:03.142 }, 00:19:03.142 { 00:19:03.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.142 "dma_device_type": 2 00:19:03.142 } 00:19:03.142 ], 00:19:03.142 "driver_specific": { 00:19:03.142 "passthru": { 00:19:03.142 "name": "pt4", 00:19:03.142 "base_bdev_name": "malloc4" 00:19:03.142 } 00:19:03.142 } 00:19:03.142 }' 00:19:03.142 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.402 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.402 15:55:23 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:03.402 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.402 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.402 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:03.402 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.402 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.402 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:03.402 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.662 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.662 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:03.662 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:03.662 15:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:03.662 [2024-07-12 15:55:24.095688] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:03.922 15:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=e00c030b-1921-4dd5-8da6-effc40918467 00:19:03.922 15:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z e00c030b-1921-4dd5-8da6-effc40918467 ']' 00:19:03.922 15:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:04.181 [2024-07-12 15:55:24.620778] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:04.181 [2024-07-12 15:55:24.620792] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:04.181 [2024-07-12 15:55:24.620830] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:04.181 [2024-07-12 15:55:24.620878] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:04.181 [2024-07-12 15:55:24.620885] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18b54c0 name raid_bdev1, state offline 00:19:04.441 15:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.441 15:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:04.441 15:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:04.441 15:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:04.441 15:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:04.441 15:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:05.010 15:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:05.010 15:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:05.270 15:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:05.270 15:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:05.839 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:05.839 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:06.099 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:06.359 [2024-07-12 15:55:26.669886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:06.359 [2024-07-12 15:55:26.670955] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:06.359 [2024-07-12 15:55:26.670990] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
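At bdev_raid.sh@456 above, bdev_raid_create is wrapped in the NOT helper from autotest_common.sh: the malloc bdevs still carry the old raid superblock, so the create that follows is expected to be rejected (the -17 "File exists" response appears just below). Stripped of the valid_exec_arg indirection, the negative assertion amounts to the sketch here; expect_rpc_failure is our stand-in name for NOT().

  # Hedged sketch of the negative test performed above: the RPC must fail for
  # the test to pass. expect_rpc_failure is a local stand-in for NOT() in
  # autotest_common.sh, not SPDK code.
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  expect_rpc_failure() {
          if "$@"; then
                  echo "expected failure, but the RPC succeeded" >&2
                  return 1        # inverted: success here means the test failed
          fi
          return 0                # a non-zero exit from the RPC is the expected outcome
  }

  expect_rpc_failure $rpc -s $sock bdev_raid_create -z 64 -r raid0 \
          -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1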
00:19:06.359 [2024-07-12 15:55:26.671016] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:06.359 [2024-07-12 15:55:26.671050] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:19:06.359 [2024-07-12 15:55:26.671077] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:06.359 [2024-07-12 15:55:26.671091] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:06.359 [2024-07-12 15:55:26.671104] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:06.359 [2024-07-12 15:55:26.671114] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:06.359 [2024-07-12 15:55:26.671120] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18aa0f0 name raid_bdev1, state configuring 00:19:06.359 request: 00:19:06.359 { 00:19:06.359 "name": "raid_bdev1", 00:19:06.359 "raid_level": "raid0", 00:19:06.359 "base_bdevs": [ 00:19:06.359 "malloc1", 00:19:06.359 "malloc2", 00:19:06.359 "malloc3", 00:19:06.359 "malloc4" 00:19:06.359 ], 00:19:06.359 "strip_size_kb": 64, 00:19:06.359 "superblock": false, 00:19:06.359 "method": "bdev_raid_create", 00:19:06.359 "req_id": 1 00:19:06.359 } 00:19:06.359 Got JSON-RPC error response 00:19:06.359 response: 00:19:06.359 { 00:19:06.359 "code": -17, 00:19:06.359 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:06.359 } 00:19:06.359 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:19:06.359 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:19:06.359 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:19:06.360 15:55:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:19:06.360 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.360 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:06.620 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:06.620 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:06.620 15:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:06.620 [2024-07-12 15:55:27.038768] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:06.620 [2024-07-12 15:55:27.038795] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:06.620 [2024-07-12 15:55:27.038807] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18accc0 00:19:06.620 [2024-07-12 15:55:27.038813] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:06.620 [2024-07-12 15:55:27.040067] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:06.620 [2024-07-12 15:55:27.040086] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:06.620 [2024-07-12 
15:55:27.040128] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:06.620 [2024-07-12 15:55:27.040146] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:06.620 pt1 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:06.620 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:06.880 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:06.880 "name": "raid_bdev1", 00:19:06.880 "uuid": "e00c030b-1921-4dd5-8da6-effc40918467", 00:19:06.880 "strip_size_kb": 64, 00:19:06.880 "state": "configuring", 00:19:06.880 "raid_level": "raid0", 00:19:06.880 "superblock": true, 00:19:06.880 "num_base_bdevs": 4, 00:19:06.880 "num_base_bdevs_discovered": 1, 00:19:06.880 "num_base_bdevs_operational": 4, 00:19:06.880 "base_bdevs_list": [ 00:19:06.880 { 00:19:06.880 "name": "pt1", 00:19:06.880 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:06.880 "is_configured": true, 00:19:06.880 "data_offset": 2048, 00:19:06.880 "data_size": 63488 00:19:06.880 }, 00:19:06.880 { 00:19:06.880 "name": null, 00:19:06.880 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:06.880 "is_configured": false, 00:19:06.880 "data_offset": 2048, 00:19:06.880 "data_size": 63488 00:19:06.880 }, 00:19:06.880 { 00:19:06.880 "name": null, 00:19:06.880 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:06.880 "is_configured": false, 00:19:06.880 "data_offset": 2048, 00:19:06.880 "data_size": 63488 00:19:06.880 }, 00:19:06.880 { 00:19:06.880 "name": null, 00:19:06.880 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:06.880 "is_configured": false, 00:19:06.880 "data_offset": 2048, 00:19:06.880 "data_size": 63488 00:19:06.880 } 00:19:06.880 ] 00:19:06.880 }' 00:19:06.880 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:06.880 15:55:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:07.448 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:07.448 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:07.708 [2024-07-12 15:55:27.953100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:07.708 [2024-07-12 15:55:27.953136] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:07.708 [2024-07-12 15:55:27.953150] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fc8f0 00:19:07.708 [2024-07-12 15:55:27.953157] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:07.708 [2024-07-12 15:55:27.953423] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:07.708 [2024-07-12 15:55:27.953434] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:07.708 [2024-07-12 15:55:27.953477] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:07.708 [2024-07-12 15:55:27.953491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:07.708 pt2 00:19:07.708 15:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:07.708 [2024-07-12 15:55:28.149595] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:07.968 "name": "raid_bdev1", 00:19:07.968 "uuid": "e00c030b-1921-4dd5-8da6-effc40918467", 00:19:07.968 "strip_size_kb": 64, 00:19:07.968 "state": "configuring", 00:19:07.968 "raid_level": "raid0", 00:19:07.968 "superblock": true, 00:19:07.968 "num_base_bdevs": 4, 00:19:07.968 "num_base_bdevs_discovered": 1, 00:19:07.968 "num_base_bdevs_operational": 4, 00:19:07.968 "base_bdevs_list": [ 00:19:07.968 { 00:19:07.968 "name": "pt1", 00:19:07.968 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:07.968 "is_configured": true, 00:19:07.968 "data_offset": 2048, 00:19:07.968 "data_size": 63488 00:19:07.968 }, 00:19:07.968 { 
00:19:07.968 "name": null, 00:19:07.968 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:07.968 "is_configured": false, 00:19:07.968 "data_offset": 2048, 00:19:07.968 "data_size": 63488 00:19:07.968 }, 00:19:07.968 { 00:19:07.968 "name": null, 00:19:07.968 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:07.968 "is_configured": false, 00:19:07.968 "data_offset": 2048, 00:19:07.968 "data_size": 63488 00:19:07.968 }, 00:19:07.968 { 00:19:07.968 "name": null, 00:19:07.968 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:07.968 "is_configured": false, 00:19:07.968 "data_offset": 2048, 00:19:07.968 "data_size": 63488 00:19:07.968 } 00:19:07.968 ] 00:19:07.968 }' 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:07.968 15:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.537 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:08.537 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:08.537 15:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:09.106 [2024-07-12 15:55:29.408799] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:09.106 [2024-07-12 15:55:29.408835] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:09.106 [2024-07-12 15:55:29.408846] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16fcf50 00:19:09.106 [2024-07-12 15:55:29.408853] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:09.106 [2024-07-12 15:55:29.409119] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:09.106 [2024-07-12 15:55:29.409131] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:09.106 [2024-07-12 15:55:29.409176] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:09.106 [2024-07-12 15:55:29.409190] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:09.106 pt2 00:19:09.106 15:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:09.106 15:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:09.106 15:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:09.365 [2024-07-12 15:55:29.617329] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:09.365 [2024-07-12 15:55:29.617358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:09.365 [2024-07-12 15:55:29.617368] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a72f0 00:19:09.365 [2024-07-12 15:55:29.617373] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:09.365 [2024-07-12 15:55:29.617623] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:09.365 [2024-07-12 15:55:29.617633] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:09.365 [2024-07-12 15:55:29.617671] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:19:09.365 [2024-07-12 15:55:29.617682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:09.365 pt3 00:19:09.365 15:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:09.365 15:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:09.365 15:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:09.936 [2024-07-12 15:55:30.142660] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:09.936 [2024-07-12 15:55:30.142697] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:09.936 [2024-07-12 15:55:30.142716] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a82e0 00:19:09.936 [2024-07-12 15:55:30.142724] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:09.936 [2024-07-12 15:55:30.142979] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:09.936 [2024-07-12 15:55:30.142990] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:09.936 [2024-07-12 15:55:30.143036] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:09.936 [2024-07-12 15:55:30.143048] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:09.936 [2024-07-12 15:55:30.143144] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16fa6e0 00:19:09.936 [2024-07-12 15:55:30.143151] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:09.936 [2024-07-12 15:55:30.143284] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1701130 00:19:09.936 [2024-07-12 15:55:30.143384] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16fa6e0 00:19:09.936 [2024-07-12 15:55:30.143389] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16fa6e0 00:19:09.936 [2024-07-12 15:55:30.143460] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:09.936 pt4 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:09.936 15:55:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.936 "name": "raid_bdev1", 00:19:09.936 "uuid": "e00c030b-1921-4dd5-8da6-effc40918467", 00:19:09.936 "strip_size_kb": 64, 00:19:09.936 "state": "online", 00:19:09.936 "raid_level": "raid0", 00:19:09.936 "superblock": true, 00:19:09.936 "num_base_bdevs": 4, 00:19:09.936 "num_base_bdevs_discovered": 4, 00:19:09.936 "num_base_bdevs_operational": 4, 00:19:09.936 "base_bdevs_list": [ 00:19:09.936 { 00:19:09.936 "name": "pt1", 00:19:09.936 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:09.936 "is_configured": true, 00:19:09.936 "data_offset": 2048, 00:19:09.936 "data_size": 63488 00:19:09.936 }, 00:19:09.936 { 00:19:09.936 "name": "pt2", 00:19:09.936 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:09.936 "is_configured": true, 00:19:09.936 "data_offset": 2048, 00:19:09.936 "data_size": 63488 00:19:09.936 }, 00:19:09.936 { 00:19:09.936 "name": "pt3", 00:19:09.936 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:09.936 "is_configured": true, 00:19:09.936 "data_offset": 2048, 00:19:09.936 "data_size": 63488 00:19:09.936 }, 00:19:09.936 { 00:19:09.936 "name": "pt4", 00:19:09.936 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:09.936 "is_configured": true, 00:19:09.936 "data_offset": 2048, 00:19:09.936 "data_size": 63488 00:19:09.936 } 00:19:09.936 ] 00:19:09.936 }' 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.936 15:55:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.505 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:10.505 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:10.505 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:10.505 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:10.505 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:10.505 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:10.505 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:10.505 15:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:10.765 [2024-07-12 15:55:31.085312] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:10.765 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:10.765 "name": "raid_bdev1", 00:19:10.765 "aliases": [ 00:19:10.765 "e00c030b-1921-4dd5-8da6-effc40918467" 00:19:10.765 ], 00:19:10.765 "product_name": "Raid Volume", 00:19:10.765 "block_size": 512, 00:19:10.765 "num_blocks": 253952, 00:19:10.765 "uuid": 
"e00c030b-1921-4dd5-8da6-effc40918467", 00:19:10.765 "assigned_rate_limits": { 00:19:10.765 "rw_ios_per_sec": 0, 00:19:10.765 "rw_mbytes_per_sec": 0, 00:19:10.765 "r_mbytes_per_sec": 0, 00:19:10.765 "w_mbytes_per_sec": 0 00:19:10.765 }, 00:19:10.765 "claimed": false, 00:19:10.765 "zoned": false, 00:19:10.765 "supported_io_types": { 00:19:10.765 "read": true, 00:19:10.765 "write": true, 00:19:10.765 "unmap": true, 00:19:10.765 "flush": true, 00:19:10.765 "reset": true, 00:19:10.765 "nvme_admin": false, 00:19:10.765 "nvme_io": false, 00:19:10.765 "nvme_io_md": false, 00:19:10.765 "write_zeroes": true, 00:19:10.765 "zcopy": false, 00:19:10.765 "get_zone_info": false, 00:19:10.765 "zone_management": false, 00:19:10.765 "zone_append": false, 00:19:10.765 "compare": false, 00:19:10.765 "compare_and_write": false, 00:19:10.765 "abort": false, 00:19:10.765 "seek_hole": false, 00:19:10.765 "seek_data": false, 00:19:10.765 "copy": false, 00:19:10.765 "nvme_iov_md": false 00:19:10.765 }, 00:19:10.765 "memory_domains": [ 00:19:10.765 { 00:19:10.765 "dma_device_id": "system", 00:19:10.765 "dma_device_type": 1 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.765 "dma_device_type": 2 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "dma_device_id": "system", 00:19:10.765 "dma_device_type": 1 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.765 "dma_device_type": 2 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "dma_device_id": "system", 00:19:10.765 "dma_device_type": 1 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.765 "dma_device_type": 2 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "dma_device_id": "system", 00:19:10.765 "dma_device_type": 1 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.765 "dma_device_type": 2 00:19:10.765 } 00:19:10.765 ], 00:19:10.765 "driver_specific": { 00:19:10.765 "raid": { 00:19:10.765 "uuid": "e00c030b-1921-4dd5-8da6-effc40918467", 00:19:10.765 "strip_size_kb": 64, 00:19:10.765 "state": "online", 00:19:10.765 "raid_level": "raid0", 00:19:10.765 "superblock": true, 00:19:10.765 "num_base_bdevs": 4, 00:19:10.765 "num_base_bdevs_discovered": 4, 00:19:10.765 "num_base_bdevs_operational": 4, 00:19:10.765 "base_bdevs_list": [ 00:19:10.765 { 00:19:10.765 "name": "pt1", 00:19:10.765 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:10.765 "is_configured": true, 00:19:10.765 "data_offset": 2048, 00:19:10.765 "data_size": 63488 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "name": "pt2", 00:19:10.765 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:10.765 "is_configured": true, 00:19:10.765 "data_offset": 2048, 00:19:10.765 "data_size": 63488 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "name": "pt3", 00:19:10.765 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:10.765 "is_configured": true, 00:19:10.765 "data_offset": 2048, 00:19:10.765 "data_size": 63488 00:19:10.765 }, 00:19:10.765 { 00:19:10.765 "name": "pt4", 00:19:10.765 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:10.765 "is_configured": true, 00:19:10.765 "data_offset": 2048, 00:19:10.765 "data_size": 63488 00:19:10.765 } 00:19:10.765 ] 00:19:10.765 } 00:19:10.765 } 00:19:10.765 }' 00:19:10.765 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:10.765 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:19:10.765 pt2 00:19:10.765 pt3 00:19:10.765 pt4' 00:19:10.765 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:10.765 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:10.765 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:11.025 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:11.025 "name": "pt1", 00:19:11.025 "aliases": [ 00:19:11.025 "00000000-0000-0000-0000-000000000001" 00:19:11.025 ], 00:19:11.025 "product_name": "passthru", 00:19:11.025 "block_size": 512, 00:19:11.025 "num_blocks": 65536, 00:19:11.025 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:11.025 "assigned_rate_limits": { 00:19:11.025 "rw_ios_per_sec": 0, 00:19:11.025 "rw_mbytes_per_sec": 0, 00:19:11.025 "r_mbytes_per_sec": 0, 00:19:11.025 "w_mbytes_per_sec": 0 00:19:11.025 }, 00:19:11.025 "claimed": true, 00:19:11.025 "claim_type": "exclusive_write", 00:19:11.025 "zoned": false, 00:19:11.025 "supported_io_types": { 00:19:11.025 "read": true, 00:19:11.025 "write": true, 00:19:11.025 "unmap": true, 00:19:11.025 "flush": true, 00:19:11.025 "reset": true, 00:19:11.025 "nvme_admin": false, 00:19:11.025 "nvme_io": false, 00:19:11.025 "nvme_io_md": false, 00:19:11.025 "write_zeroes": true, 00:19:11.025 "zcopy": true, 00:19:11.025 "get_zone_info": false, 00:19:11.025 "zone_management": false, 00:19:11.025 "zone_append": false, 00:19:11.025 "compare": false, 00:19:11.025 "compare_and_write": false, 00:19:11.025 "abort": true, 00:19:11.025 "seek_hole": false, 00:19:11.025 "seek_data": false, 00:19:11.025 "copy": true, 00:19:11.025 "nvme_iov_md": false 00:19:11.025 }, 00:19:11.025 "memory_domains": [ 00:19:11.025 { 00:19:11.025 "dma_device_id": "system", 00:19:11.025 "dma_device_type": 1 00:19:11.025 }, 00:19:11.025 { 00:19:11.025 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.025 "dma_device_type": 2 00:19:11.025 } 00:19:11.025 ], 00:19:11.025 "driver_specific": { 00:19:11.025 "passthru": { 00:19:11.025 "name": "pt1", 00:19:11.025 "base_bdev_name": "malloc1" 00:19:11.025 } 00:19:11.025 } 00:19:11.025 }' 00:19:11.025 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.025 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.025 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:11.025 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:11.284 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:11.544 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:11.544 "name": "pt2", 00:19:11.544 "aliases": [ 00:19:11.544 "00000000-0000-0000-0000-000000000002" 00:19:11.544 ], 00:19:11.544 "product_name": "passthru", 00:19:11.544 "block_size": 512, 00:19:11.544 "num_blocks": 65536, 00:19:11.544 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:11.544 "assigned_rate_limits": { 00:19:11.544 "rw_ios_per_sec": 0, 00:19:11.544 "rw_mbytes_per_sec": 0, 00:19:11.544 "r_mbytes_per_sec": 0, 00:19:11.544 "w_mbytes_per_sec": 0 00:19:11.544 }, 00:19:11.544 "claimed": true, 00:19:11.544 "claim_type": "exclusive_write", 00:19:11.544 "zoned": false, 00:19:11.544 "supported_io_types": { 00:19:11.544 "read": true, 00:19:11.544 "write": true, 00:19:11.544 "unmap": true, 00:19:11.545 "flush": true, 00:19:11.545 "reset": true, 00:19:11.545 "nvme_admin": false, 00:19:11.545 "nvme_io": false, 00:19:11.545 "nvme_io_md": false, 00:19:11.545 "write_zeroes": true, 00:19:11.545 "zcopy": true, 00:19:11.545 "get_zone_info": false, 00:19:11.545 "zone_management": false, 00:19:11.545 "zone_append": false, 00:19:11.545 "compare": false, 00:19:11.545 "compare_and_write": false, 00:19:11.545 "abort": true, 00:19:11.545 "seek_hole": false, 00:19:11.545 "seek_data": false, 00:19:11.545 "copy": true, 00:19:11.545 "nvme_iov_md": false 00:19:11.545 }, 00:19:11.545 "memory_domains": [ 00:19:11.545 { 00:19:11.545 "dma_device_id": "system", 00:19:11.545 "dma_device_type": 1 00:19:11.545 }, 00:19:11.545 { 00:19:11.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.545 "dma_device_type": 2 00:19:11.545 } 00:19:11.545 ], 00:19:11.545 "driver_specific": { 00:19:11.545 "passthru": { 00:19:11.545 "name": "pt2", 00:19:11.545 "base_bdev_name": "malloc2" 00:19:11.545 } 00:19:11.545 } 00:19:11.545 }' 00:19:11.545 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.545 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.545 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:11.545 15:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:11.804 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:12.064 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:12.064 "name": "pt3", 00:19:12.064 "aliases": [ 00:19:12.064 "00000000-0000-0000-0000-000000000003" 00:19:12.064 ], 00:19:12.064 "product_name": "passthru", 00:19:12.064 "block_size": 512, 00:19:12.064 "num_blocks": 65536, 00:19:12.064 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:12.064 "assigned_rate_limits": { 00:19:12.064 "rw_ios_per_sec": 0, 00:19:12.064 "rw_mbytes_per_sec": 0, 00:19:12.064 "r_mbytes_per_sec": 0, 00:19:12.064 "w_mbytes_per_sec": 0 00:19:12.064 }, 00:19:12.064 "claimed": true, 00:19:12.064 "claim_type": "exclusive_write", 00:19:12.064 "zoned": false, 00:19:12.064 "supported_io_types": { 00:19:12.064 "read": true, 00:19:12.064 "write": true, 00:19:12.064 "unmap": true, 00:19:12.064 "flush": true, 00:19:12.064 "reset": true, 00:19:12.064 "nvme_admin": false, 00:19:12.064 "nvme_io": false, 00:19:12.064 "nvme_io_md": false, 00:19:12.064 "write_zeroes": true, 00:19:12.064 "zcopy": true, 00:19:12.064 "get_zone_info": false, 00:19:12.064 "zone_management": false, 00:19:12.064 "zone_append": false, 00:19:12.064 "compare": false, 00:19:12.064 "compare_and_write": false, 00:19:12.064 "abort": true, 00:19:12.064 "seek_hole": false, 00:19:12.064 "seek_data": false, 00:19:12.064 "copy": true, 00:19:12.064 "nvme_iov_md": false 00:19:12.064 }, 00:19:12.064 "memory_domains": [ 00:19:12.064 { 00:19:12.064 "dma_device_id": "system", 00:19:12.064 "dma_device_type": 1 00:19:12.064 }, 00:19:12.064 { 00:19:12.064 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.064 "dma_device_type": 2 00:19:12.064 } 00:19:12.064 ], 00:19:12.064 "driver_specific": { 00:19:12.064 "passthru": { 00:19:12.064 "name": "pt3", 00:19:12.064 "base_bdev_name": "malloc3" 00:19:12.064 } 00:19:12.064 } 00:19:12.064 }' 00:19:12.064 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.064 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:12.324 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:12.324 
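The verify_raid_bdev_state calls traced above (bdev_raid.sh@116-128) reduce to fetching the raid bdev's JSON with bdev_raid_get_bdevs all and comparing a handful of fields against the expected values. A compressed, hedged sketch of that check is below; assert_raid_state is our own helper name, and the expected values ("online", raid0, 64 KiB strip, 4 discovered base bdevs) are the ones this run verifies.

  # Hedged sketch of a verify_raid_bdev_state-style check, matching the values
  # asserted for raid_bdev1 above. assert_raid_state is a local helper, not SPDK code.
  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock

  assert_raid_state() {
          local name=$1 state=$2 level=$3 strip=$4 nbase=$5 info
          info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r ".[] | select(.name == \"$name\")")
          [[ $(echo "$info" | jq -r .state) == "$state" ]]
          [[ $(echo "$info" | jq -r .raid_level) == "$level" ]]
          [[ $(echo "$info" | jq .strip_size_kb) -eq $strip ]]
          [[ $(echo "$info" | jq .num_base_bdevs_discovered) -eq $nbase ]]
  }

  assert_raid_state raid_bdev1 online raid0 64 4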
15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:12.584 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:12.584 "name": "pt4", 00:19:12.584 "aliases": [ 00:19:12.584 "00000000-0000-0000-0000-000000000004" 00:19:12.584 ], 00:19:12.584 "product_name": "passthru", 00:19:12.584 "block_size": 512, 00:19:12.584 "num_blocks": 65536, 00:19:12.584 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:12.584 "assigned_rate_limits": { 00:19:12.584 "rw_ios_per_sec": 0, 00:19:12.584 "rw_mbytes_per_sec": 0, 00:19:12.584 "r_mbytes_per_sec": 0, 00:19:12.584 "w_mbytes_per_sec": 0 00:19:12.584 }, 00:19:12.584 "claimed": true, 00:19:12.584 "claim_type": "exclusive_write", 00:19:12.584 "zoned": false, 00:19:12.584 "supported_io_types": { 00:19:12.584 "read": true, 00:19:12.584 "write": true, 00:19:12.584 "unmap": true, 00:19:12.584 "flush": true, 00:19:12.584 "reset": true, 00:19:12.584 "nvme_admin": false, 00:19:12.584 "nvme_io": false, 00:19:12.584 "nvme_io_md": false, 00:19:12.584 "write_zeroes": true, 00:19:12.584 "zcopy": true, 00:19:12.584 "get_zone_info": false, 00:19:12.584 "zone_management": false, 00:19:12.584 "zone_append": false, 00:19:12.584 "compare": false, 00:19:12.584 "compare_and_write": false, 00:19:12.584 "abort": true, 00:19:12.584 "seek_hole": false, 00:19:12.584 "seek_data": false, 00:19:12.584 "copy": true, 00:19:12.584 "nvme_iov_md": false 00:19:12.584 }, 00:19:12.584 "memory_domains": [ 00:19:12.584 { 00:19:12.584 "dma_device_id": "system", 00:19:12.584 "dma_device_type": 1 00:19:12.584 }, 00:19:12.584 { 00:19:12.584 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.584 "dma_device_type": 2 00:19:12.584 } 00:19:12.584 ], 00:19:12.584 "driver_specific": { 00:19:12.584 "passthru": { 00:19:12.584 "name": "pt4", 00:19:12.584 "base_bdev_name": "malloc4" 00:19:12.584 } 00:19:12.584 } 00:19:12.584 }' 00:19:12.584 15:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.584 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:12.584 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:12.584 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:12.852 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:13.151 [2024-07-12 15:55:33.439287] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:13.151 15:55:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' e00c030b-1921-4dd5-8da6-effc40918467 '!=' e00c030b-1921-4dd5-8da6-effc40918467 ']' 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2578951 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2578951 ']' 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2578951 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2578951 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2578951' 00:19:13.151 killing process with pid 2578951 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2578951 00:19:13.151 [2024-07-12 15:55:33.492458] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:13.151 [2024-07-12 15:55:33.492503] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:13.151 [2024-07-12 15:55:33.492555] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:13.151 [2024-07-12 15:55:33.492562] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16fa6e0 name raid_bdev1, state offline 00:19:13.151 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2578951 00:19:13.151 [2024-07-12 15:55:33.513154] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:13.411 15:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:13.411 00:19:13.411 real 0m17.221s 00:19:13.411 user 0m31.955s 00:19:13.411 sys 0m2.332s 00:19:13.411 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:13.411 15:55:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:13.411 ************************************ 00:19:13.411 END TEST raid_superblock_test 00:19:13.411 ************************************ 00:19:13.411 15:55:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:13.411 15:55:33 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:19:13.412 15:55:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:13.412 15:55:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:13.412 15:55:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:13.412 ************************************ 00:19:13.412 START TEST raid_read_error_test 00:19:13.412 ************************************ 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- 
# raid_io_error_test raid0 4 read 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.bKQbvzPc5G 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2582307 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2582307 /var/tmp/spdk-raid.sock 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:13.412 15:55:33 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2582307 ']' 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:13.412 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:13.412 15:55:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:13.412 [2024-07-12 15:55:33.784509] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:19:13.412 [2024-07-12 15:55:33.784556] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2582307 ] 00:19:13.672 [2024-07-12 15:55:33.869802] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:13.672 [2024-07-12 15:55:33.933134] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:13.672 [2024-07-12 15:55:33.975820] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:13.672 [2024-07-12 15:55:33.975844] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:14.241 15:55:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:14.241 15:55:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:14.241 15:55:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:14.241 15:55:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:14.500 BaseBdev1_malloc 00:19:14.500 15:55:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:14.760 true 00:19:14.761 15:55:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:14.761 [2024-07-12 15:55:35.154054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:14.761 [2024-07-12 15:55:35.154086] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:14.761 [2024-07-12 15:55:35.154098] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a84aa0 00:19:14.761 [2024-07-12 15:55:35.154104] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:14.761 [2024-07-12 15:55:35.155322] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:14.761 [2024-07-12 15:55:35.155343] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:14.761 BaseBdev1 00:19:14.761 15:55:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # 
for bdev in "${base_bdevs[@]}" 00:19:14.761 15:55:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:15.020 BaseBdev2_malloc 00:19:15.020 15:55:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:15.280 true 00:19:15.280 15:55:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:15.280 [2024-07-12 15:55:35.721263] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:15.280 [2024-07-12 15:55:35.721289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:15.280 [2024-07-12 15:55:35.721299] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a89e40 00:19:15.280 [2024-07-12 15:55:35.721306] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:15.280 [2024-07-12 15:55:35.722441] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:15.280 [2024-07-12 15:55:35.722459] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:15.280 BaseBdev2 00:19:15.541 15:55:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:15.541 15:55:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:15.541 BaseBdev3_malloc 00:19:15.541 15:55:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:15.800 true 00:19:15.800 15:55:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:16.060 [2024-07-12 15:55:36.272251] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:16.060 [2024-07-12 15:55:36.272274] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:16.060 [2024-07-12 15:55:36.272283] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a8b7f0 00:19:16.060 [2024-07-12 15:55:36.272289] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:16.060 [2024-07-12 15:55:36.273426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:16.060 [2024-07-12 15:55:36.273444] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:16.060 BaseBdev3 00:19:16.060 15:55:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:16.060 15:55:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:16.060 BaseBdev4_malloc 00:19:16.060 15:55:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:16.320 true 00:19:16.320 15:55:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:16.580 [2024-07-12 15:55:36.815188] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:16.580 [2024-07-12 15:55:36.815212] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:16.580 [2024-07-12 15:55:36.815222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a898b0 00:19:16.581 [2024-07-12 15:55:36.815229] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:16.581 [2024-07-12 15:55:36.816363] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:16.581 [2024-07-12 15:55:36.816381] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:16.581 BaseBdev4 00:19:16.581 15:55:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:16.581 [2024-07-12 15:55:36.995671] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:16.581 [2024-07-12 15:55:36.996646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:16.581 [2024-07-12 15:55:36.996698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:16.581 [2024-07-12 15:55:36.996752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:16.581 [2024-07-12 15:55:36.996927] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a8d290 00:19:16.581 [2024-07-12 15:55:36.996934] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:16.581 [2024-07-12 15:55:36.997068] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a8d7e0 00:19:16.581 [2024-07-12 15:55:36.997182] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a8d290 00:19:16.581 [2024-07-12 15:55:36.997187] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a8d290 00:19:16.581 [2024-07-12 15:55:36.997260] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
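For reference, the array assembly traced above reduces to roughly the following RPC sequence. This is a minimal sketch put together only from commands visible in this trace; $rpc and $sock are shorthands introduced here (not part of the test script) for the scripts/rpc.py path and the -s socket used by this run, and the first three calls are repeated for BaseBdev2 through BaseBdev4:

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # 32 MB malloc bdev with 512-byte blocks
  $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  # wrap it in an error-injection bdev (registered as EE_BaseBdev1_malloc) ...
  $rpc -s $sock bdev_error_create BaseBdev1_malloc
  # ... and expose it under the plain name through a passthru bdev
  $rpc -s $sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
  # assemble the four passthru bdevs into a raid0 array with a 64k strip and superblock
  $rpc -s $sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s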
00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.581 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:16.841 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.841 "name": "raid_bdev1", 00:19:16.841 "uuid": "de7296a5-dc9f-4e64-9360-9e2ba644a11b", 00:19:16.841 "strip_size_kb": 64, 00:19:16.841 "state": "online", 00:19:16.841 "raid_level": "raid0", 00:19:16.841 "superblock": true, 00:19:16.841 "num_base_bdevs": 4, 00:19:16.841 "num_base_bdevs_discovered": 4, 00:19:16.841 "num_base_bdevs_operational": 4, 00:19:16.841 "base_bdevs_list": [ 00:19:16.841 { 00:19:16.841 "name": "BaseBdev1", 00:19:16.841 "uuid": "60076902-8e61-5e24-8c12-3dffb91894ab", 00:19:16.841 "is_configured": true, 00:19:16.841 "data_offset": 2048, 00:19:16.841 "data_size": 63488 00:19:16.841 }, 00:19:16.841 { 00:19:16.841 "name": "BaseBdev2", 00:19:16.841 "uuid": "70430fd8-f478-5798-9269-883682cd1b79", 00:19:16.841 "is_configured": true, 00:19:16.841 "data_offset": 2048, 00:19:16.841 "data_size": 63488 00:19:16.841 }, 00:19:16.841 { 00:19:16.841 "name": "BaseBdev3", 00:19:16.841 "uuid": "ce6fe9fa-a04c-56ee-a7b8-5dc24f91e1c1", 00:19:16.841 "is_configured": true, 00:19:16.841 "data_offset": 2048, 00:19:16.841 "data_size": 63488 00:19:16.841 }, 00:19:16.841 { 00:19:16.841 "name": "BaseBdev4", 00:19:16.841 "uuid": "71a4e38a-6900-5616-920c-c3eb5d35d38e", 00:19:16.841 "is_configured": true, 00:19:16.841 "data_offset": 2048, 00:19:16.841 "data_size": 63488 00:19:16.841 } 00:19:16.841 ] 00:19:16.841 }' 00:19:16.841 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.841 15:55:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.409 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:17.409 15:55:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:17.409 [2024-07-12 15:55:37.834006] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a91b20 00:19:18.347 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.607 15:55:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:18.868 15:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.868 "name": "raid_bdev1", 00:19:18.868 "uuid": "de7296a5-dc9f-4e64-9360-9e2ba644a11b", 00:19:18.868 "strip_size_kb": 64, 00:19:18.868 "state": "online", 00:19:18.868 "raid_level": "raid0", 00:19:18.868 "superblock": true, 00:19:18.868 "num_base_bdevs": 4, 00:19:18.868 "num_base_bdevs_discovered": 4, 00:19:18.868 "num_base_bdevs_operational": 4, 00:19:18.868 "base_bdevs_list": [ 00:19:18.868 { 00:19:18.868 "name": "BaseBdev1", 00:19:18.868 "uuid": "60076902-8e61-5e24-8c12-3dffb91894ab", 00:19:18.868 "is_configured": true, 00:19:18.868 "data_offset": 2048, 00:19:18.868 "data_size": 63488 00:19:18.868 }, 00:19:18.868 { 00:19:18.868 "name": "BaseBdev2", 00:19:18.868 "uuid": "70430fd8-f478-5798-9269-883682cd1b79", 00:19:18.868 "is_configured": true, 00:19:18.868 "data_offset": 2048, 00:19:18.868 "data_size": 63488 00:19:18.868 }, 00:19:18.868 { 00:19:18.868 "name": "BaseBdev3", 00:19:18.868 "uuid": "ce6fe9fa-a04c-56ee-a7b8-5dc24f91e1c1", 00:19:18.868 "is_configured": true, 00:19:18.868 "data_offset": 2048, 00:19:18.868 "data_size": 63488 00:19:18.868 }, 00:19:18.868 { 00:19:18.868 "name": "BaseBdev4", 00:19:18.868 "uuid": "71a4e38a-6900-5616-920c-c3eb5d35d38e", 00:19:18.868 "is_configured": true, 00:19:18.868 "data_offset": 2048, 00:19:18.868 "data_size": 63488 00:19:18.868 } 00:19:18.868 ] 00:19:18.868 }' 00:19:18.868 15:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.868 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.438 15:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:19.438 [2024-07-12 15:55:39.877846] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:19.438 [2024-07-12 15:55:39.877881] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:19.438 [2024-07-12 15:55:39.880479] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:19.438 [2024-07-12 15:55:39.880507] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:19.438 [2024-07-12 15:55:39.880538] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:19.438 [2024-07-12 15:55:39.880544] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1a8d290 name raid_bdev1, state offline 00:19:19.438 0 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2582307 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2582307 ']' 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2582307 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2582307 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2582307' 00:19:19.697 killing process with pid 2582307 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2582307 00:19:19.697 [2024-07-12 15:55:39.964078] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:19.697 15:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2582307 00:19:19.697 [2024-07-12 15:55:39.981145] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.bKQbvzPc5G 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:19:19.697 00:19:19.697 real 0m6.402s 00:19:19.697 user 0m10.338s 00:19:19.697 sys 0m0.862s 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:19.697 15:55:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.697 ************************************ 00:19:19.697 END TEST raid_read_error_test 00:19:19.697 ************************************ 00:19:19.958 15:55:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:19.958 15:55:40 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:19:19.958 15:55:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:19.958 15:55:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:19.958 15:55:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:19.958 ************************************ 00:19:19.958 START TEST raid_write_error_test 00:19:19.958 ************************************ 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write 
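Each raid_io_error_test pass (the read pass summarized above, and the write pass starting here) drives the same injection-and-check cycle once the array is online. A rough sketch using only commands shown in this trace, with $rpc/$sock as in the earlier sketch; /raidtest/tmp.bKQbvzPc5G is the mktemp bdevperf log of the read pass, and the write pass substitutes 'write failure' and its own temp log:

  # with bdevperf attached to raid_bdev1 and a run triggered via
  # examples/bdev/bdevperf/bdevperf.py -s $sock perform_tests,
  # inject failures on read I/O submitted to the first base bdev's error device
  $rpc -s $sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
  # raid0 has no redundancy, so the failure rate bdevperf reports for raid_bdev1 must be non-zero
  fail_per_s=$(grep -v Job /raidtest/tmp.bKQbvzPc5G | grep raid_bdev1 | awk '{print $6}')
  [[ $fail_per_s != "0.00" ]]
  # tear the array down again
  $rpc -s $sock bdev_raid_delete raid_bdev1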
00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.w9ua7ReZBM 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2583348 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2583348 /var/tmp/spdk-raid.sock 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:19.958 15:55:40 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2583348 ']' 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:19.958 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:19.958 15:55:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.958 [2024-07-12 15:55:40.265633] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:19:19.958 [2024-07-12 15:55:40.265688] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2583348 ] 00:19:19.958 [2024-07-12 15:55:40.357135] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:20.218 [2024-07-12 15:55:40.424284] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:20.218 [2024-07-12 15:55:40.464726] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:20.218 [2024-07-12 15:55:40.464748] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:20.787 15:55:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:20.787 15:55:41 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:20.787 15:55:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:20.787 15:55:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:21.356 BaseBdev1_malloc 00:19:21.356 15:55:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:21.616 true 00:19:21.616 15:55:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:22.184 [2024-07-12 15:55:42.344902] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:22.184 [2024-07-12 15:55:42.344934] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:22.184 [2024-07-12 15:55:42.344946] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2467aa0 00:19:22.184 [2024-07-12 15:55:42.344953] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:22.184 [2024-07-12 15:55:42.346226] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:22.184 [2024-07-12 15:55:42.346246] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:22.184 BaseBdev1 00:19:22.184 15:55:42 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:22.184 15:55:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:22.184 BaseBdev2_malloc 00:19:22.184 15:55:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:22.803 true 00:19:22.803 15:55:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:23.062 [2024-07-12 15:55:43.289250] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:23.062 [2024-07-12 15:55:43.289278] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:23.062 [2024-07-12 15:55:43.289290] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246ce40 00:19:23.062 [2024-07-12 15:55:43.289297] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:23.062 [2024-07-12 15:55:43.290497] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:23.062 [2024-07-12 15:55:43.290516] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:23.062 BaseBdev2 00:19:23.062 15:55:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:23.062 15:55:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:23.630 BaseBdev3_malloc 00:19:23.630 15:55:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:23.630 true 00:19:23.630 15:55:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:24.197 [2024-07-12 15:55:44.546266] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:24.197 [2024-07-12 15:55:44.546299] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:24.197 [2024-07-12 15:55:44.546313] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246e7f0 00:19:24.197 [2024-07-12 15:55:44.546319] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:24.197 [2024-07-12 15:55:44.547514] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:24.198 [2024-07-12 15:55:44.547535] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:24.198 BaseBdev3 00:19:24.198 15:55:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:24.198 15:55:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:24.457 BaseBdev4_malloc 00:19:24.457 15:55:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:25.025 true 00:19:25.025 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:25.285 [2024-07-12 15:55:45.490539] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:25.285 [2024-07-12 15:55:45.490567] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:25.285 [2024-07-12 15:55:45.490580] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246c8b0 00:19:25.285 [2024-07-12 15:55:45.490586] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:25.285 [2024-07-12 15:55:45.491777] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:25.285 [2024-07-12 15:55:45.491795] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:25.285 BaseBdev4 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:25.285 [2024-07-12 15:55:45.683049] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:25.285 [2024-07-12 15:55:45.684040] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:25.285 [2024-07-12 15:55:45.684093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:25.285 [2024-07-12 15:55:45.684140] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:25.285 [2024-07-12 15:55:45.684315] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2470290 00:19:25.285 [2024-07-12 15:55:45.684323] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:25.285 [2024-07-12 15:55:45.684461] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24707e0 00:19:25.285 [2024-07-12 15:55:45.684576] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2470290 00:19:25.285 [2024-07-12 15:55:45.684582] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2470290 00:19:25.285 [2024-07-12 15:55:45.684654] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.285 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:25.545 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.545 "name": "raid_bdev1", 00:19:25.545 "uuid": "7249738e-16bc-4a9c-b83c-94baa34ead75", 00:19:25.545 "strip_size_kb": 64, 00:19:25.545 "state": "online", 00:19:25.545 "raid_level": "raid0", 00:19:25.545 "superblock": true, 00:19:25.545 "num_base_bdevs": 4, 00:19:25.545 "num_base_bdevs_discovered": 4, 00:19:25.545 "num_base_bdevs_operational": 4, 00:19:25.545 "base_bdevs_list": [ 00:19:25.545 { 00:19:25.545 "name": "BaseBdev1", 00:19:25.545 "uuid": "f9f03957-0465-52ba-b123-e89a95cfc16d", 00:19:25.545 "is_configured": true, 00:19:25.545 "data_offset": 2048, 00:19:25.545 "data_size": 63488 00:19:25.545 }, 00:19:25.545 { 00:19:25.545 "name": "BaseBdev2", 00:19:25.545 "uuid": "9cb8973f-0183-58b1-8644-7a3645c245a5", 00:19:25.545 "is_configured": true, 00:19:25.545 "data_offset": 2048, 00:19:25.545 "data_size": 63488 00:19:25.545 }, 00:19:25.545 { 00:19:25.545 "name": "BaseBdev3", 00:19:25.545 "uuid": "74f39194-a160-5ffa-ab38-ec6252670062", 00:19:25.545 "is_configured": true, 00:19:25.545 "data_offset": 2048, 00:19:25.545 "data_size": 63488 00:19:25.545 }, 00:19:25.545 { 00:19:25.545 "name": "BaseBdev4", 00:19:25.545 "uuid": "dbd4858c-1c68-5ec0-b70a-63e27ee6f3e9", 00:19:25.545 "is_configured": true, 00:19:25.545 "data_offset": 2048, 00:19:25.545 "data_size": 63488 00:19:25.545 } 00:19:25.545 ] 00:19:25.545 }' 00:19:25.545 15:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.545 15:55:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:26.115 15:55:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:26.115 15:55:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:26.115 [2024-07-12 15:55:46.521434] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2474b20 00:19:27.056 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:27.316 15:55:47 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.316 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:27.576 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:27.576 "name": "raid_bdev1", 00:19:27.576 "uuid": "7249738e-16bc-4a9c-b83c-94baa34ead75", 00:19:27.576 "strip_size_kb": 64, 00:19:27.576 "state": "online", 00:19:27.576 "raid_level": "raid0", 00:19:27.576 "superblock": true, 00:19:27.576 "num_base_bdevs": 4, 00:19:27.576 "num_base_bdevs_discovered": 4, 00:19:27.576 "num_base_bdevs_operational": 4, 00:19:27.576 "base_bdevs_list": [ 00:19:27.576 { 00:19:27.576 "name": "BaseBdev1", 00:19:27.576 "uuid": "f9f03957-0465-52ba-b123-e89a95cfc16d", 00:19:27.576 "is_configured": true, 00:19:27.576 "data_offset": 2048, 00:19:27.576 "data_size": 63488 00:19:27.576 }, 00:19:27.576 { 00:19:27.576 "name": "BaseBdev2", 00:19:27.576 "uuid": "9cb8973f-0183-58b1-8644-7a3645c245a5", 00:19:27.576 "is_configured": true, 00:19:27.576 "data_offset": 2048, 00:19:27.576 "data_size": 63488 00:19:27.576 }, 00:19:27.576 { 00:19:27.576 "name": "BaseBdev3", 00:19:27.576 "uuid": "74f39194-a160-5ffa-ab38-ec6252670062", 00:19:27.576 "is_configured": true, 00:19:27.576 "data_offset": 2048, 00:19:27.576 "data_size": 63488 00:19:27.576 }, 00:19:27.576 { 00:19:27.576 "name": "BaseBdev4", 00:19:27.576 "uuid": "dbd4858c-1c68-5ec0-b70a-63e27ee6f3e9", 00:19:27.576 "is_configured": true, 00:19:27.576 "data_offset": 2048, 00:19:27.576 "data_size": 63488 00:19:27.576 } 00:19:27.576 ] 00:19:27.576 }' 00:19:27.576 15:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:27.576 15:55:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.147 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:28.147 [2024-07-12 15:55:48.561417] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:28.147 [2024-07-12 15:55:48.561446] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:28.147 [2024-07-12 15:55:48.564033] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:28.147 [2024-07-12 15:55:48.564059] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:28.147 [2024-07-12 15:55:48.564090] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:28.147 [2024-07-12 
15:55:48.564096] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2470290 name raid_bdev1, state offline 00:19:28.147 0 00:19:28.147 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2583348 00:19:28.147 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2583348 ']' 00:19:28.147 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2583348 00:19:28.147 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:28.147 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:28.147 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2583348 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2583348' 00:19:28.407 killing process with pid 2583348 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2583348 00:19:28.407 [2024-07-12 15:55:48.629023] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2583348 00:19:28.407 [2024-07-12 15:55:48.646039] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.w9ua7ReZBM 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:19:28.407 00:19:28.407 real 0m8.583s 00:19:28.407 user 0m14.446s 00:19:28.407 sys 0m1.093s 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:28.407 15:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.408 ************************************ 00:19:28.408 END TEST raid_write_error_test 00:19:28.408 ************************************ 00:19:28.408 15:55:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:28.408 15:55:48 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:28.408 15:55:48 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:19:28.408 15:55:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:28.408 15:55:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:28.408 15:55:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:28.668 ************************************ 00:19:28.668 START TEST raid_state_function_test 
00:19:28.668 ************************************ 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2584957 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 
2584957' 00:19:28.668 Process raid pid: 2584957 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2584957 /var/tmp/spdk-raid.sock 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2584957 ']' 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:28.668 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:28.668 15:55:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:28.668 [2024-07-12 15:55:48.918519] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:19:28.668 [2024-07-12 15:55:48.918566] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:28.668 [2024-07-12 15:55:49.005863] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:28.668 [2024-07-12 15:55:49.068640] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:28.668 [2024-07-12 15:55:49.111762] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:28.668 [2024-07-12 15:55:49.111783] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:29.609 15:55:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:29.609 15:55:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:29.609 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:29.609 [2024-07-12 15:55:49.910787] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:29.609 [2024-07-12 15:55:49.910813] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:29.610 [2024-07-12 15:55:49.910819] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:29.610 [2024-07-12 15:55:49.910825] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:29.610 [2024-07-12 15:55:49.910830] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:29.610 [2024-07-12 15:55:49.910835] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:29.610 [2024-07-12 15:55:49.910840] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:29.610 [2024-07-12 15:55:49.910845] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:29.610 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:29.870 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:29.870 "name": "Existed_Raid", 00:19:29.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.870 "strip_size_kb": 64, 00:19:29.870 "state": "configuring", 00:19:29.870 "raid_level": "concat", 00:19:29.870 "superblock": false, 00:19:29.870 "num_base_bdevs": 4, 00:19:29.870 "num_base_bdevs_discovered": 0, 00:19:29.870 "num_base_bdevs_operational": 4, 00:19:29.870 "base_bdevs_list": [ 00:19:29.870 { 00:19:29.870 "name": "BaseBdev1", 00:19:29.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.870 "is_configured": false, 00:19:29.870 "data_offset": 0, 00:19:29.870 "data_size": 0 00:19:29.870 }, 00:19:29.870 { 00:19:29.870 "name": "BaseBdev2", 00:19:29.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.870 "is_configured": false, 00:19:29.870 "data_offset": 0, 00:19:29.870 "data_size": 0 00:19:29.870 }, 00:19:29.870 { 00:19:29.870 "name": "BaseBdev3", 00:19:29.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.870 "is_configured": false, 00:19:29.870 "data_offset": 0, 00:19:29.870 "data_size": 0 00:19:29.870 }, 00:19:29.870 { 00:19:29.870 "name": "BaseBdev4", 00:19:29.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:29.870 "is_configured": false, 00:19:29.870 "data_offset": 0, 00:19:29.870 "data_size": 0 00:19:29.870 } 00:19:29.870 ] 00:19:29.870 }' 00:19:29.870 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:29.870 15:55:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.456 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:30.456 [2024-07-12 15:55:50.800936] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:30.456 
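The surrounding raid_state_function_test trace exercises the 'configuring' state: a concat array is created before any of its base bdevs exist, inspected, deleted and re-created, and only then is a real BaseBdev1 registered. A sketch of that sequence, again using only commands visible in this log ($rpc/$sock as above; the RPC server here is the bdev_svc app started as test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid):

  # create a concat array whose four base bdevs do not exist yet -> state stays "configuring"
  $rpc -s $sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'
  # delete and re-create it, then register a real first base bdev
  $rpc -s $sock bdev_raid_delete Existed_Raid
  $rpc -s $sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1
  # the array remains "configuring" with num_base_bdevs_discovered now 1 (see the JSON dump that follows)
  $rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'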
[2024-07-12 15:55:50.800955] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14cb920 name Existed_Raid, state configuring 00:19:30.456 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:30.728 [2024-07-12 15:55:50.997449] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:30.728 [2024-07-12 15:55:50.997467] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:30.728 [2024-07-12 15:55:50.997473] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:30.728 [2024-07-12 15:55:50.997478] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:30.728 [2024-07-12 15:55:50.997483] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:30.728 [2024-07-12 15:55:50.997488] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:30.728 [2024-07-12 15:55:50.997492] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:30.728 [2024-07-12 15:55:50.997498] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:30.728 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:30.988 [2024-07-12 15:55:51.188356] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:30.988 BaseBdev1 00:19:30.988 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:30.988 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:30.988 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:30.988 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:30.988 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:30.988 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:30.988 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:30.988 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:31.249 [ 00:19:31.249 { 00:19:31.249 "name": "BaseBdev1", 00:19:31.249 "aliases": [ 00:19:31.249 "2973075d-529a-4491-bdf7-de21af49edb4" 00:19:31.249 ], 00:19:31.249 "product_name": "Malloc disk", 00:19:31.249 "block_size": 512, 00:19:31.249 "num_blocks": 65536, 00:19:31.249 "uuid": "2973075d-529a-4491-bdf7-de21af49edb4", 00:19:31.249 "assigned_rate_limits": { 00:19:31.249 "rw_ios_per_sec": 0, 00:19:31.249 "rw_mbytes_per_sec": 0, 00:19:31.249 "r_mbytes_per_sec": 0, 00:19:31.249 "w_mbytes_per_sec": 0 00:19:31.249 }, 00:19:31.249 "claimed": true, 00:19:31.249 "claim_type": "exclusive_write", 00:19:31.249 "zoned": false, 
00:19:31.249 "supported_io_types": { 00:19:31.249 "read": true, 00:19:31.249 "write": true, 00:19:31.249 "unmap": true, 00:19:31.249 "flush": true, 00:19:31.249 "reset": true, 00:19:31.249 "nvme_admin": false, 00:19:31.249 "nvme_io": false, 00:19:31.249 "nvme_io_md": false, 00:19:31.249 "write_zeroes": true, 00:19:31.249 "zcopy": true, 00:19:31.249 "get_zone_info": false, 00:19:31.249 "zone_management": false, 00:19:31.249 "zone_append": false, 00:19:31.249 "compare": false, 00:19:31.249 "compare_and_write": false, 00:19:31.249 "abort": true, 00:19:31.249 "seek_hole": false, 00:19:31.249 "seek_data": false, 00:19:31.249 "copy": true, 00:19:31.249 "nvme_iov_md": false 00:19:31.249 }, 00:19:31.249 "memory_domains": [ 00:19:31.249 { 00:19:31.249 "dma_device_id": "system", 00:19:31.249 "dma_device_type": 1 00:19:31.249 }, 00:19:31.249 { 00:19:31.249 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:31.249 "dma_device_type": 2 00:19:31.249 } 00:19:31.249 ], 00:19:31.249 "driver_specific": {} 00:19:31.249 } 00:19:31.249 ] 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.249 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:31.510 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.510 "name": "Existed_Raid", 00:19:31.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.510 "strip_size_kb": 64, 00:19:31.510 "state": "configuring", 00:19:31.510 "raid_level": "concat", 00:19:31.510 "superblock": false, 00:19:31.510 "num_base_bdevs": 4, 00:19:31.510 "num_base_bdevs_discovered": 1, 00:19:31.510 "num_base_bdevs_operational": 4, 00:19:31.510 "base_bdevs_list": [ 00:19:31.510 { 00:19:31.510 "name": "BaseBdev1", 00:19:31.510 "uuid": "2973075d-529a-4491-bdf7-de21af49edb4", 00:19:31.510 "is_configured": true, 00:19:31.510 "data_offset": 0, 00:19:31.510 "data_size": 65536 00:19:31.510 }, 00:19:31.510 { 00:19:31.510 "name": "BaseBdev2", 00:19:31.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.510 "is_configured": false, 00:19:31.510 "data_offset": 0, 00:19:31.510 
"data_size": 0 00:19:31.510 }, 00:19:31.510 { 00:19:31.510 "name": "BaseBdev3", 00:19:31.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.510 "is_configured": false, 00:19:31.510 "data_offset": 0, 00:19:31.510 "data_size": 0 00:19:31.510 }, 00:19:31.510 { 00:19:31.510 "name": "BaseBdev4", 00:19:31.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.510 "is_configured": false, 00:19:31.510 "data_offset": 0, 00:19:31.510 "data_size": 0 00:19:31.510 } 00:19:31.510 ] 00:19:31.510 }' 00:19:31.510 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:31.510 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:32.078 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:32.078 [2024-07-12 15:55:52.495666] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:32.078 [2024-07-12 15:55:52.495692] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14cb190 name Existed_Raid, state configuring 00:19:32.078 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:32.338 [2024-07-12 15:55:52.688186] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:32.338 [2024-07-12 15:55:52.689298] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:32.338 [2024-07-12 15:55:52.689320] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:32.338 [2024-07-12 15:55:52.689326] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:32.338 [2024-07-12 15:55:52.689332] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:32.338 [2024-07-12 15:55:52.689337] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:32.338 [2024-07-12 15:55:52.689343] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.338 15:55:52 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.338 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:32.597 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.597 "name": "Existed_Raid", 00:19:32.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.597 "strip_size_kb": 64, 00:19:32.597 "state": "configuring", 00:19:32.597 "raid_level": "concat", 00:19:32.597 "superblock": false, 00:19:32.597 "num_base_bdevs": 4, 00:19:32.597 "num_base_bdevs_discovered": 1, 00:19:32.597 "num_base_bdevs_operational": 4, 00:19:32.597 "base_bdevs_list": [ 00:19:32.597 { 00:19:32.597 "name": "BaseBdev1", 00:19:32.597 "uuid": "2973075d-529a-4491-bdf7-de21af49edb4", 00:19:32.597 "is_configured": true, 00:19:32.597 "data_offset": 0, 00:19:32.597 "data_size": 65536 00:19:32.597 }, 00:19:32.597 { 00:19:32.597 "name": "BaseBdev2", 00:19:32.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.597 "is_configured": false, 00:19:32.597 "data_offset": 0, 00:19:32.597 "data_size": 0 00:19:32.597 }, 00:19:32.597 { 00:19:32.597 "name": "BaseBdev3", 00:19:32.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.597 "is_configured": false, 00:19:32.597 "data_offset": 0, 00:19:32.597 "data_size": 0 00:19:32.597 }, 00:19:32.597 { 00:19:32.597 "name": "BaseBdev4", 00:19:32.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.597 "is_configured": false, 00:19:32.597 "data_offset": 0, 00:19:32.597 "data_size": 0 00:19:32.597 } 00:19:32.597 ] 00:19:32.597 }' 00:19:32.597 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.597 15:55:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.166 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:33.166 [2024-07-12 15:55:53.591415] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:33.166 BaseBdev2 00:19:33.166 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:33.166 15:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:33.166 15:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:33.166 15:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:33.166 15:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:33.166 15:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:33.166 15:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:33.426 15:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:33.686 [ 00:19:33.686 { 00:19:33.686 "name": "BaseBdev2", 00:19:33.686 "aliases": [ 00:19:33.686 "737c2d2f-f9de-44ad-b14a-ae01faa1b5cc" 00:19:33.686 ], 00:19:33.686 "product_name": "Malloc disk", 00:19:33.686 "block_size": 512, 00:19:33.686 "num_blocks": 65536, 00:19:33.686 "uuid": "737c2d2f-f9de-44ad-b14a-ae01faa1b5cc", 00:19:33.686 "assigned_rate_limits": { 00:19:33.686 "rw_ios_per_sec": 0, 00:19:33.686 "rw_mbytes_per_sec": 0, 00:19:33.686 "r_mbytes_per_sec": 0, 00:19:33.686 "w_mbytes_per_sec": 0 00:19:33.686 }, 00:19:33.686 "claimed": true, 00:19:33.686 "claim_type": "exclusive_write", 00:19:33.686 "zoned": false, 00:19:33.686 "supported_io_types": { 00:19:33.686 "read": true, 00:19:33.686 "write": true, 00:19:33.686 "unmap": true, 00:19:33.686 "flush": true, 00:19:33.686 "reset": true, 00:19:33.686 "nvme_admin": false, 00:19:33.686 "nvme_io": false, 00:19:33.686 "nvme_io_md": false, 00:19:33.686 "write_zeroes": true, 00:19:33.686 "zcopy": true, 00:19:33.686 "get_zone_info": false, 00:19:33.686 "zone_management": false, 00:19:33.686 "zone_append": false, 00:19:33.686 "compare": false, 00:19:33.686 "compare_and_write": false, 00:19:33.686 "abort": true, 00:19:33.686 "seek_hole": false, 00:19:33.686 "seek_data": false, 00:19:33.686 "copy": true, 00:19:33.686 "nvme_iov_md": false 00:19:33.686 }, 00:19:33.686 "memory_domains": [ 00:19:33.686 { 00:19:33.686 "dma_device_id": "system", 00:19:33.686 "dma_device_type": 1 00:19:33.686 }, 00:19:33.686 { 00:19:33.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:33.686 "dma_device_type": 2 00:19:33.686 } 00:19:33.686 ], 00:19:33.686 "driver_specific": {} 00:19:33.686 } 00:19:33.686 ] 00:19:33.686 15:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:33.686 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:33.686 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:33.686 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:33.686 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.686 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:33.947 15:55:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.947 "name": "Existed_Raid", 00:19:33.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.947 "strip_size_kb": 64, 00:19:33.947 "state": "configuring", 00:19:33.947 "raid_level": "concat", 00:19:33.947 "superblock": false, 00:19:33.947 "num_base_bdevs": 4, 00:19:33.947 "num_base_bdevs_discovered": 2, 00:19:33.947 "num_base_bdevs_operational": 4, 00:19:33.947 "base_bdevs_list": [ 00:19:33.947 { 00:19:33.947 "name": "BaseBdev1", 00:19:33.947 "uuid": "2973075d-529a-4491-bdf7-de21af49edb4", 00:19:33.947 "is_configured": true, 00:19:33.947 "data_offset": 0, 00:19:33.947 "data_size": 65536 00:19:33.947 }, 00:19:33.947 { 00:19:33.947 "name": "BaseBdev2", 00:19:33.947 "uuid": "737c2d2f-f9de-44ad-b14a-ae01faa1b5cc", 00:19:33.947 "is_configured": true, 00:19:33.947 "data_offset": 0, 00:19:33.947 "data_size": 65536 00:19:33.947 }, 00:19:33.947 { 00:19:33.947 "name": "BaseBdev3", 00:19:33.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.947 "is_configured": false, 00:19:33.947 "data_offset": 0, 00:19:33.947 "data_size": 0 00:19:33.947 }, 00:19:33.947 { 00:19:33.947 "name": "BaseBdev4", 00:19:33.947 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.947 "is_configured": false, 00:19:33.947 "data_offset": 0, 00:19:33.947 "data_size": 0 00:19:33.947 } 00:19:33.947 ] 00:19:33.947 }' 00:19:33.947 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.947 15:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:34.517 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:34.517 [2024-07-12 15:55:54.927702] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:34.517 BaseBdev3 00:19:34.517 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:34.517 15:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:34.517 15:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:34.517 15:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:34.517 15:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:34.517 15:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:34.517 15:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:34.778 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:35.038 [ 00:19:35.038 { 00:19:35.038 "name": "BaseBdev3", 00:19:35.038 "aliases": [ 00:19:35.038 "3e63ac6b-de76-4747-96b7-fe6747c25ab2" 00:19:35.038 ], 00:19:35.038 "product_name": "Malloc disk", 00:19:35.038 "block_size": 512, 00:19:35.038 "num_blocks": 65536, 00:19:35.038 "uuid": "3e63ac6b-de76-4747-96b7-fe6747c25ab2", 00:19:35.038 "assigned_rate_limits": { 00:19:35.038 "rw_ios_per_sec": 0, 00:19:35.038 "rw_mbytes_per_sec": 0, 00:19:35.038 "r_mbytes_per_sec": 0, 
00:19:35.038 "w_mbytes_per_sec": 0 00:19:35.038 }, 00:19:35.038 "claimed": true, 00:19:35.038 "claim_type": "exclusive_write", 00:19:35.038 "zoned": false, 00:19:35.038 "supported_io_types": { 00:19:35.038 "read": true, 00:19:35.038 "write": true, 00:19:35.038 "unmap": true, 00:19:35.038 "flush": true, 00:19:35.038 "reset": true, 00:19:35.038 "nvme_admin": false, 00:19:35.038 "nvme_io": false, 00:19:35.038 "nvme_io_md": false, 00:19:35.038 "write_zeroes": true, 00:19:35.038 "zcopy": true, 00:19:35.038 "get_zone_info": false, 00:19:35.038 "zone_management": false, 00:19:35.038 "zone_append": false, 00:19:35.038 "compare": false, 00:19:35.038 "compare_and_write": false, 00:19:35.038 "abort": true, 00:19:35.038 "seek_hole": false, 00:19:35.038 "seek_data": false, 00:19:35.038 "copy": true, 00:19:35.038 "nvme_iov_md": false 00:19:35.038 }, 00:19:35.038 "memory_domains": [ 00:19:35.038 { 00:19:35.038 "dma_device_id": "system", 00:19:35.038 "dma_device_type": 1 00:19:35.038 }, 00:19:35.038 { 00:19:35.038 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.038 "dma_device_type": 2 00:19:35.038 } 00:19:35.038 ], 00:19:35.038 "driver_specific": {} 00:19:35.038 } 00:19:35.038 ] 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.038 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:35.299 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.299 "name": "Existed_Raid", 00:19:35.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.299 "strip_size_kb": 64, 00:19:35.299 "state": "configuring", 00:19:35.299 "raid_level": "concat", 00:19:35.299 "superblock": false, 00:19:35.299 "num_base_bdevs": 4, 00:19:35.299 "num_base_bdevs_discovered": 3, 00:19:35.299 "num_base_bdevs_operational": 4, 00:19:35.299 "base_bdevs_list": [ 00:19:35.299 { 00:19:35.299 "name": "BaseBdev1", 
00:19:35.299 "uuid": "2973075d-529a-4491-bdf7-de21af49edb4", 00:19:35.299 "is_configured": true, 00:19:35.299 "data_offset": 0, 00:19:35.299 "data_size": 65536 00:19:35.299 }, 00:19:35.299 { 00:19:35.299 "name": "BaseBdev2", 00:19:35.299 "uuid": "737c2d2f-f9de-44ad-b14a-ae01faa1b5cc", 00:19:35.299 "is_configured": true, 00:19:35.299 "data_offset": 0, 00:19:35.299 "data_size": 65536 00:19:35.299 }, 00:19:35.299 { 00:19:35.299 "name": "BaseBdev3", 00:19:35.299 "uuid": "3e63ac6b-de76-4747-96b7-fe6747c25ab2", 00:19:35.299 "is_configured": true, 00:19:35.299 "data_offset": 0, 00:19:35.299 "data_size": 65536 00:19:35.299 }, 00:19:35.299 { 00:19:35.299 "name": "BaseBdev4", 00:19:35.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.299 "is_configured": false, 00:19:35.299 "data_offset": 0, 00:19:35.299 "data_size": 0 00:19:35.299 } 00:19:35.299 ] 00:19:35.299 }' 00:19:35.299 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.299 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:35.876 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:35.876 [2024-07-12 15:55:56.191917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:35.876 [2024-07-12 15:55:56.191942] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14cc1d0 00:19:35.876 [2024-07-12 15:55:56.191946] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:35.876 [2024-07-12 15:55:56.192126] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14cd220 00:19:35.876 [2024-07-12 15:55:56.192219] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14cc1d0 00:19:35.876 [2024-07-12 15:55:56.192224] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x14cc1d0 00:19:35.876 [2024-07-12 15:55:56.192342] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:35.876 BaseBdev4 00:19:35.876 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:35.876 15:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:35.876 15:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:35.876 15:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:35.876 15:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:35.876 15:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:35.876 15:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:36.137 [ 00:19:36.137 { 00:19:36.137 "name": "BaseBdev4", 00:19:36.137 "aliases": [ 00:19:36.137 "f69c2188-e01a-46c7-9a86-3489bad1f01a" 00:19:36.137 ], 00:19:36.137 "product_name": "Malloc disk", 00:19:36.137 "block_size": 512, 00:19:36.137 
"num_blocks": 65536, 00:19:36.137 "uuid": "f69c2188-e01a-46c7-9a86-3489bad1f01a", 00:19:36.137 "assigned_rate_limits": { 00:19:36.137 "rw_ios_per_sec": 0, 00:19:36.137 "rw_mbytes_per_sec": 0, 00:19:36.137 "r_mbytes_per_sec": 0, 00:19:36.137 "w_mbytes_per_sec": 0 00:19:36.137 }, 00:19:36.137 "claimed": true, 00:19:36.137 "claim_type": "exclusive_write", 00:19:36.137 "zoned": false, 00:19:36.137 "supported_io_types": { 00:19:36.137 "read": true, 00:19:36.137 "write": true, 00:19:36.137 "unmap": true, 00:19:36.137 "flush": true, 00:19:36.137 "reset": true, 00:19:36.137 "nvme_admin": false, 00:19:36.137 "nvme_io": false, 00:19:36.137 "nvme_io_md": false, 00:19:36.137 "write_zeroes": true, 00:19:36.137 "zcopy": true, 00:19:36.137 "get_zone_info": false, 00:19:36.137 "zone_management": false, 00:19:36.137 "zone_append": false, 00:19:36.137 "compare": false, 00:19:36.137 "compare_and_write": false, 00:19:36.137 "abort": true, 00:19:36.137 "seek_hole": false, 00:19:36.137 "seek_data": false, 00:19:36.137 "copy": true, 00:19:36.137 "nvme_iov_md": false 00:19:36.137 }, 00:19:36.137 "memory_domains": [ 00:19:36.137 { 00:19:36.137 "dma_device_id": "system", 00:19:36.137 "dma_device_type": 1 00:19:36.137 }, 00:19:36.137 { 00:19:36.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:36.137 "dma_device_type": 2 00:19:36.137 } 00:19:36.137 ], 00:19:36.137 "driver_specific": {} 00:19:36.137 } 00:19:36.137 ] 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.137 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:36.397 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.397 "name": "Existed_Raid", 00:19:36.397 "uuid": "8cbb9e57-5424-4396-a822-89dbcb5926cb", 00:19:36.397 "strip_size_kb": 64, 00:19:36.397 "state": "online", 00:19:36.397 "raid_level": "concat", 00:19:36.397 "superblock": false, 
00:19:36.397 "num_base_bdevs": 4, 00:19:36.397 "num_base_bdevs_discovered": 4, 00:19:36.397 "num_base_bdevs_operational": 4, 00:19:36.397 "base_bdevs_list": [ 00:19:36.397 { 00:19:36.397 "name": "BaseBdev1", 00:19:36.397 "uuid": "2973075d-529a-4491-bdf7-de21af49edb4", 00:19:36.397 "is_configured": true, 00:19:36.397 "data_offset": 0, 00:19:36.397 "data_size": 65536 00:19:36.397 }, 00:19:36.397 { 00:19:36.397 "name": "BaseBdev2", 00:19:36.397 "uuid": "737c2d2f-f9de-44ad-b14a-ae01faa1b5cc", 00:19:36.397 "is_configured": true, 00:19:36.397 "data_offset": 0, 00:19:36.397 "data_size": 65536 00:19:36.397 }, 00:19:36.397 { 00:19:36.397 "name": "BaseBdev3", 00:19:36.397 "uuid": "3e63ac6b-de76-4747-96b7-fe6747c25ab2", 00:19:36.397 "is_configured": true, 00:19:36.397 "data_offset": 0, 00:19:36.397 "data_size": 65536 00:19:36.397 }, 00:19:36.397 { 00:19:36.397 "name": "BaseBdev4", 00:19:36.397 "uuid": "f69c2188-e01a-46c7-9a86-3489bad1f01a", 00:19:36.397 "is_configured": true, 00:19:36.397 "data_offset": 0, 00:19:36.397 "data_size": 65536 00:19:36.397 } 00:19:36.397 ] 00:19:36.397 }' 00:19:36.397 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.397 15:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.967 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:36.967 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:36.967 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:36.967 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:36.967 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:36.967 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:36.967 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:36.967 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:37.227 [2024-07-12 15:55:57.415356] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:37.227 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:37.227 "name": "Existed_Raid", 00:19:37.227 "aliases": [ 00:19:37.227 "8cbb9e57-5424-4396-a822-89dbcb5926cb" 00:19:37.227 ], 00:19:37.227 "product_name": "Raid Volume", 00:19:37.227 "block_size": 512, 00:19:37.227 "num_blocks": 262144, 00:19:37.227 "uuid": "8cbb9e57-5424-4396-a822-89dbcb5926cb", 00:19:37.227 "assigned_rate_limits": { 00:19:37.227 "rw_ios_per_sec": 0, 00:19:37.227 "rw_mbytes_per_sec": 0, 00:19:37.227 "r_mbytes_per_sec": 0, 00:19:37.227 "w_mbytes_per_sec": 0 00:19:37.227 }, 00:19:37.227 "claimed": false, 00:19:37.227 "zoned": false, 00:19:37.227 "supported_io_types": { 00:19:37.227 "read": true, 00:19:37.227 "write": true, 00:19:37.227 "unmap": true, 00:19:37.227 "flush": true, 00:19:37.227 "reset": true, 00:19:37.227 "nvme_admin": false, 00:19:37.227 "nvme_io": false, 00:19:37.227 "nvme_io_md": false, 00:19:37.227 "write_zeroes": true, 00:19:37.227 "zcopy": false, 00:19:37.227 "get_zone_info": false, 00:19:37.227 "zone_management": false, 00:19:37.227 "zone_append": false, 00:19:37.227 "compare": false, 00:19:37.227 
"compare_and_write": false, 00:19:37.227 "abort": false, 00:19:37.227 "seek_hole": false, 00:19:37.227 "seek_data": false, 00:19:37.227 "copy": false, 00:19:37.227 "nvme_iov_md": false 00:19:37.227 }, 00:19:37.227 "memory_domains": [ 00:19:37.227 { 00:19:37.227 "dma_device_id": "system", 00:19:37.227 "dma_device_type": 1 00:19:37.227 }, 00:19:37.227 { 00:19:37.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.227 "dma_device_type": 2 00:19:37.227 }, 00:19:37.227 { 00:19:37.227 "dma_device_id": "system", 00:19:37.227 "dma_device_type": 1 00:19:37.227 }, 00:19:37.227 { 00:19:37.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.227 "dma_device_type": 2 00:19:37.227 }, 00:19:37.227 { 00:19:37.227 "dma_device_id": "system", 00:19:37.227 "dma_device_type": 1 00:19:37.227 }, 00:19:37.227 { 00:19:37.227 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.228 "dma_device_type": 2 00:19:37.228 }, 00:19:37.228 { 00:19:37.228 "dma_device_id": "system", 00:19:37.228 "dma_device_type": 1 00:19:37.228 }, 00:19:37.228 { 00:19:37.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.228 "dma_device_type": 2 00:19:37.228 } 00:19:37.228 ], 00:19:37.228 "driver_specific": { 00:19:37.228 "raid": { 00:19:37.228 "uuid": "8cbb9e57-5424-4396-a822-89dbcb5926cb", 00:19:37.228 "strip_size_kb": 64, 00:19:37.228 "state": "online", 00:19:37.228 "raid_level": "concat", 00:19:37.228 "superblock": false, 00:19:37.228 "num_base_bdevs": 4, 00:19:37.228 "num_base_bdevs_discovered": 4, 00:19:37.228 "num_base_bdevs_operational": 4, 00:19:37.228 "base_bdevs_list": [ 00:19:37.228 { 00:19:37.228 "name": "BaseBdev1", 00:19:37.228 "uuid": "2973075d-529a-4491-bdf7-de21af49edb4", 00:19:37.228 "is_configured": true, 00:19:37.228 "data_offset": 0, 00:19:37.228 "data_size": 65536 00:19:37.228 }, 00:19:37.228 { 00:19:37.228 "name": "BaseBdev2", 00:19:37.228 "uuid": "737c2d2f-f9de-44ad-b14a-ae01faa1b5cc", 00:19:37.228 "is_configured": true, 00:19:37.228 "data_offset": 0, 00:19:37.228 "data_size": 65536 00:19:37.228 }, 00:19:37.228 { 00:19:37.228 "name": "BaseBdev3", 00:19:37.228 "uuid": "3e63ac6b-de76-4747-96b7-fe6747c25ab2", 00:19:37.228 "is_configured": true, 00:19:37.228 "data_offset": 0, 00:19:37.228 "data_size": 65536 00:19:37.228 }, 00:19:37.228 { 00:19:37.228 "name": "BaseBdev4", 00:19:37.228 "uuid": "f69c2188-e01a-46c7-9a86-3489bad1f01a", 00:19:37.228 "is_configured": true, 00:19:37.228 "data_offset": 0, 00:19:37.228 "data_size": 65536 00:19:37.228 } 00:19:37.228 ] 00:19:37.228 } 00:19:37.228 } 00:19:37.228 }' 00:19:37.228 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:37.228 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:37.228 BaseBdev2 00:19:37.228 BaseBdev3 00:19:37.228 BaseBdev4' 00:19:37.228 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:37.228 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:37.228 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:37.488 "name": "BaseBdev1", 00:19:37.488 "aliases": [ 00:19:37.488 "2973075d-529a-4491-bdf7-de21af49edb4" 00:19:37.488 ], 00:19:37.488 
"product_name": "Malloc disk", 00:19:37.488 "block_size": 512, 00:19:37.488 "num_blocks": 65536, 00:19:37.488 "uuid": "2973075d-529a-4491-bdf7-de21af49edb4", 00:19:37.488 "assigned_rate_limits": { 00:19:37.488 "rw_ios_per_sec": 0, 00:19:37.488 "rw_mbytes_per_sec": 0, 00:19:37.488 "r_mbytes_per_sec": 0, 00:19:37.488 "w_mbytes_per_sec": 0 00:19:37.488 }, 00:19:37.488 "claimed": true, 00:19:37.488 "claim_type": "exclusive_write", 00:19:37.488 "zoned": false, 00:19:37.488 "supported_io_types": { 00:19:37.488 "read": true, 00:19:37.488 "write": true, 00:19:37.488 "unmap": true, 00:19:37.488 "flush": true, 00:19:37.488 "reset": true, 00:19:37.488 "nvme_admin": false, 00:19:37.488 "nvme_io": false, 00:19:37.488 "nvme_io_md": false, 00:19:37.488 "write_zeroes": true, 00:19:37.488 "zcopy": true, 00:19:37.488 "get_zone_info": false, 00:19:37.488 "zone_management": false, 00:19:37.488 "zone_append": false, 00:19:37.488 "compare": false, 00:19:37.488 "compare_and_write": false, 00:19:37.488 "abort": true, 00:19:37.488 "seek_hole": false, 00:19:37.488 "seek_data": false, 00:19:37.488 "copy": true, 00:19:37.488 "nvme_iov_md": false 00:19:37.488 }, 00:19:37.488 "memory_domains": [ 00:19:37.488 { 00:19:37.488 "dma_device_id": "system", 00:19:37.488 "dma_device_type": 1 00:19:37.488 }, 00:19:37.488 { 00:19:37.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.488 "dma_device_type": 2 00:19:37.488 } 00:19:37.488 ], 00:19:37.488 "driver_specific": {} 00:19:37.488 }' 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:37.488 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.748 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:37.748 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:37.748 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:37.748 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:37.748 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:37.748 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:37.748 "name": "BaseBdev2", 00:19:37.748 "aliases": [ 00:19:37.748 "737c2d2f-f9de-44ad-b14a-ae01faa1b5cc" 00:19:37.748 ], 00:19:37.749 "product_name": "Malloc disk", 00:19:37.749 "block_size": 512, 00:19:37.749 "num_blocks": 65536, 00:19:37.749 "uuid": "737c2d2f-f9de-44ad-b14a-ae01faa1b5cc", 00:19:37.749 
"assigned_rate_limits": { 00:19:37.749 "rw_ios_per_sec": 0, 00:19:37.749 "rw_mbytes_per_sec": 0, 00:19:37.749 "r_mbytes_per_sec": 0, 00:19:37.749 "w_mbytes_per_sec": 0 00:19:37.749 }, 00:19:37.749 "claimed": true, 00:19:37.749 "claim_type": "exclusive_write", 00:19:37.749 "zoned": false, 00:19:37.749 "supported_io_types": { 00:19:37.749 "read": true, 00:19:37.749 "write": true, 00:19:37.749 "unmap": true, 00:19:37.749 "flush": true, 00:19:37.749 "reset": true, 00:19:37.749 "nvme_admin": false, 00:19:37.749 "nvme_io": false, 00:19:37.749 "nvme_io_md": false, 00:19:37.749 "write_zeroes": true, 00:19:37.749 "zcopy": true, 00:19:37.749 "get_zone_info": false, 00:19:37.749 "zone_management": false, 00:19:37.749 "zone_append": false, 00:19:37.749 "compare": false, 00:19:37.749 "compare_and_write": false, 00:19:37.749 "abort": true, 00:19:37.749 "seek_hole": false, 00:19:37.749 "seek_data": false, 00:19:37.749 "copy": true, 00:19:37.749 "nvme_iov_md": false 00:19:37.749 }, 00:19:37.749 "memory_domains": [ 00:19:37.749 { 00:19:37.749 "dma_device_id": "system", 00:19:37.749 "dma_device_type": 1 00:19:37.749 }, 00:19:37.749 { 00:19:37.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.749 "dma_device_type": 2 00:19:37.749 } 00:19:37.749 ], 00:19:37.749 "driver_specific": {} 00:19:37.749 }' 00:19:37.749 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.009 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.009 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.009 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.009 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.009 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:38.009 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.009 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.269 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:38.269 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.269 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.269 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:38.269 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:38.269 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:38.269 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:38.528 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:38.528 "name": "BaseBdev3", 00:19:38.528 "aliases": [ 00:19:38.528 "3e63ac6b-de76-4747-96b7-fe6747c25ab2" 00:19:38.528 ], 00:19:38.528 "product_name": "Malloc disk", 00:19:38.528 "block_size": 512, 00:19:38.529 "num_blocks": 65536, 00:19:38.529 "uuid": "3e63ac6b-de76-4747-96b7-fe6747c25ab2", 00:19:38.529 "assigned_rate_limits": { 00:19:38.529 "rw_ios_per_sec": 0, 00:19:38.529 "rw_mbytes_per_sec": 0, 00:19:38.529 "r_mbytes_per_sec": 0, 00:19:38.529 "w_mbytes_per_sec": 0 00:19:38.529 
}, 00:19:38.529 "claimed": true, 00:19:38.529 "claim_type": "exclusive_write", 00:19:38.529 "zoned": false, 00:19:38.529 "supported_io_types": { 00:19:38.529 "read": true, 00:19:38.529 "write": true, 00:19:38.529 "unmap": true, 00:19:38.529 "flush": true, 00:19:38.529 "reset": true, 00:19:38.529 "nvme_admin": false, 00:19:38.529 "nvme_io": false, 00:19:38.529 "nvme_io_md": false, 00:19:38.529 "write_zeroes": true, 00:19:38.529 "zcopy": true, 00:19:38.529 "get_zone_info": false, 00:19:38.529 "zone_management": false, 00:19:38.529 "zone_append": false, 00:19:38.529 "compare": false, 00:19:38.529 "compare_and_write": false, 00:19:38.529 "abort": true, 00:19:38.529 "seek_hole": false, 00:19:38.529 "seek_data": false, 00:19:38.529 "copy": true, 00:19:38.529 "nvme_iov_md": false 00:19:38.529 }, 00:19:38.529 "memory_domains": [ 00:19:38.529 { 00:19:38.529 "dma_device_id": "system", 00:19:38.529 "dma_device_type": 1 00:19:38.529 }, 00:19:38.529 { 00:19:38.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:38.529 "dma_device_type": 2 00:19:38.529 } 00:19:38.529 ], 00:19:38.529 "driver_specific": {} 00:19:38.529 }' 00:19:38.529 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.529 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:38.529 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:38.529 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.529 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:38.529 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:38.529 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.529 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:38.789 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:38.789 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.789 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:38.789 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:38.789 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:38.789 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:38.789 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:39.049 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:39.049 "name": "BaseBdev4", 00:19:39.049 "aliases": [ 00:19:39.049 "f69c2188-e01a-46c7-9a86-3489bad1f01a" 00:19:39.049 ], 00:19:39.049 "product_name": "Malloc disk", 00:19:39.049 "block_size": 512, 00:19:39.049 "num_blocks": 65536, 00:19:39.049 "uuid": "f69c2188-e01a-46c7-9a86-3489bad1f01a", 00:19:39.049 "assigned_rate_limits": { 00:19:39.049 "rw_ios_per_sec": 0, 00:19:39.049 "rw_mbytes_per_sec": 0, 00:19:39.049 "r_mbytes_per_sec": 0, 00:19:39.049 "w_mbytes_per_sec": 0 00:19:39.049 }, 00:19:39.049 "claimed": true, 00:19:39.049 "claim_type": "exclusive_write", 00:19:39.049 "zoned": false, 00:19:39.049 "supported_io_types": { 00:19:39.049 "read": true, 
00:19:39.049 "write": true, 00:19:39.049 "unmap": true, 00:19:39.049 "flush": true, 00:19:39.049 "reset": true, 00:19:39.049 "nvme_admin": false, 00:19:39.049 "nvme_io": false, 00:19:39.049 "nvme_io_md": false, 00:19:39.049 "write_zeroes": true, 00:19:39.049 "zcopy": true, 00:19:39.049 "get_zone_info": false, 00:19:39.049 "zone_management": false, 00:19:39.049 "zone_append": false, 00:19:39.049 "compare": false, 00:19:39.050 "compare_and_write": false, 00:19:39.050 "abort": true, 00:19:39.050 "seek_hole": false, 00:19:39.050 "seek_data": false, 00:19:39.050 "copy": true, 00:19:39.050 "nvme_iov_md": false 00:19:39.050 }, 00:19:39.050 "memory_domains": [ 00:19:39.050 { 00:19:39.050 "dma_device_id": "system", 00:19:39.050 "dma_device_type": 1 00:19:39.050 }, 00:19:39.050 { 00:19:39.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.050 "dma_device_type": 2 00:19:39.050 } 00:19:39.050 ], 00:19:39.050 "driver_specific": {} 00:19:39.050 }' 00:19:39.050 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:39.050 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:39.050 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:39.050 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:39.050 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:39.050 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:39.050 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.310 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:39.310 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:39.310 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:39.310 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:39.310 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:39.310 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:39.570 [2024-07-12 15:55:59.821246] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:39.570 [2024-07-12 15:55:59.821265] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:39.570 [2024-07-12 15:55:59.821301] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:39.570 15:55:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.570 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:39.830 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.830 "name": "Existed_Raid", 00:19:39.830 "uuid": "8cbb9e57-5424-4396-a822-89dbcb5926cb", 00:19:39.830 "strip_size_kb": 64, 00:19:39.830 "state": "offline", 00:19:39.830 "raid_level": "concat", 00:19:39.830 "superblock": false, 00:19:39.830 "num_base_bdevs": 4, 00:19:39.830 "num_base_bdevs_discovered": 3, 00:19:39.830 "num_base_bdevs_operational": 3, 00:19:39.830 "base_bdevs_list": [ 00:19:39.830 { 00:19:39.830 "name": null, 00:19:39.830 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.830 "is_configured": false, 00:19:39.830 "data_offset": 0, 00:19:39.830 "data_size": 65536 00:19:39.830 }, 00:19:39.830 { 00:19:39.830 "name": "BaseBdev2", 00:19:39.830 "uuid": "737c2d2f-f9de-44ad-b14a-ae01faa1b5cc", 00:19:39.830 "is_configured": true, 00:19:39.830 "data_offset": 0, 00:19:39.830 "data_size": 65536 00:19:39.830 }, 00:19:39.830 { 00:19:39.830 "name": "BaseBdev3", 00:19:39.830 "uuid": "3e63ac6b-de76-4747-96b7-fe6747c25ab2", 00:19:39.830 "is_configured": true, 00:19:39.830 "data_offset": 0, 00:19:39.830 "data_size": 65536 00:19:39.830 }, 00:19:39.830 { 00:19:39.830 "name": "BaseBdev4", 00:19:39.830 "uuid": "f69c2188-e01a-46c7-9a86-3489bad1f01a", 00:19:39.830 "is_configured": true, 00:19:39.830 "data_offset": 0, 00:19:39.830 "data_size": 65536 00:19:39.830 } 00:19:39.830 ] 00:19:39.830 }' 00:19:39.830 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.830 15:56:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:40.400 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:40.400 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:40.400 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.400 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:40.400 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:40.400 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:19:40.400 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:40.660 [2024-07-12 15:56:00.944110] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:40.660 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:40.660 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:40.660 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.660 15:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:40.920 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:40.920 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:40.920 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:40.920 [2024-07-12 15:56:01.331076] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:40.920 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:40.920 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:40.920 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.920 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:41.180 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:41.180 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:41.180 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:41.439 [2024-07-12 15:56:01.717944] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:41.439 [2024-07-12 15:56:01.717973] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14cc1d0 name Existed_Raid, state offline 00:19:41.439 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:41.439 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:41.439 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:41.439 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.699 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:41.699 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:41.699 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:41.699 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i 
= 1 )) 00:19:41.699 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:41.699 15:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:41.699 BaseBdev2 00:19:41.699 15:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:41.699 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:41.699 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:41.699 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:41.699 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:41.699 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:41.699 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:41.959 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:42.218 [ 00:19:42.218 { 00:19:42.218 "name": "BaseBdev2", 00:19:42.218 "aliases": [ 00:19:42.219 "a8728151-d937-4798-be30-67c248983879" 00:19:42.219 ], 00:19:42.219 "product_name": "Malloc disk", 00:19:42.219 "block_size": 512, 00:19:42.219 "num_blocks": 65536, 00:19:42.219 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:42.219 "assigned_rate_limits": { 00:19:42.219 "rw_ios_per_sec": 0, 00:19:42.219 "rw_mbytes_per_sec": 0, 00:19:42.219 "r_mbytes_per_sec": 0, 00:19:42.219 "w_mbytes_per_sec": 0 00:19:42.219 }, 00:19:42.219 "claimed": false, 00:19:42.219 "zoned": false, 00:19:42.219 "supported_io_types": { 00:19:42.219 "read": true, 00:19:42.219 "write": true, 00:19:42.219 "unmap": true, 00:19:42.219 "flush": true, 00:19:42.219 "reset": true, 00:19:42.219 "nvme_admin": false, 00:19:42.219 "nvme_io": false, 00:19:42.219 "nvme_io_md": false, 00:19:42.219 "write_zeroes": true, 00:19:42.219 "zcopy": true, 00:19:42.219 "get_zone_info": false, 00:19:42.219 "zone_management": false, 00:19:42.219 "zone_append": false, 00:19:42.219 "compare": false, 00:19:42.219 "compare_and_write": false, 00:19:42.219 "abort": true, 00:19:42.219 "seek_hole": false, 00:19:42.219 "seek_data": false, 00:19:42.219 "copy": true, 00:19:42.219 "nvme_iov_md": false 00:19:42.219 }, 00:19:42.219 "memory_domains": [ 00:19:42.219 { 00:19:42.219 "dma_device_id": "system", 00:19:42.219 "dma_device_type": 1 00:19:42.219 }, 00:19:42.219 { 00:19:42.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.219 "dma_device_type": 2 00:19:42.219 } 00:19:42.219 ], 00:19:42.219 "driver_specific": {} 00:19:42.219 } 00:19:42.219 ] 00:19:42.219 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:42.219 15:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:42.219 15:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:42.219 15:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:42.479 BaseBdev3 00:19:42.479 15:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:42.479 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:42.479 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:42.479 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:42.479 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:42.479 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:42.479 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:42.479 15:56:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:42.745 [ 00:19:42.745 { 00:19:42.745 "name": "BaseBdev3", 00:19:42.745 "aliases": [ 00:19:42.745 "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3" 00:19:42.745 ], 00:19:42.745 "product_name": "Malloc disk", 00:19:42.745 "block_size": 512, 00:19:42.745 "num_blocks": 65536, 00:19:42.745 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:42.745 "assigned_rate_limits": { 00:19:42.745 "rw_ios_per_sec": 0, 00:19:42.745 "rw_mbytes_per_sec": 0, 00:19:42.745 "r_mbytes_per_sec": 0, 00:19:42.745 "w_mbytes_per_sec": 0 00:19:42.745 }, 00:19:42.745 "claimed": false, 00:19:42.745 "zoned": false, 00:19:42.745 "supported_io_types": { 00:19:42.745 "read": true, 00:19:42.745 "write": true, 00:19:42.745 "unmap": true, 00:19:42.745 "flush": true, 00:19:42.745 "reset": true, 00:19:42.745 "nvme_admin": false, 00:19:42.745 "nvme_io": false, 00:19:42.745 "nvme_io_md": false, 00:19:42.745 "write_zeroes": true, 00:19:42.745 "zcopy": true, 00:19:42.745 "get_zone_info": false, 00:19:42.745 "zone_management": false, 00:19:42.745 "zone_append": false, 00:19:42.745 "compare": false, 00:19:42.745 "compare_and_write": false, 00:19:42.745 "abort": true, 00:19:42.745 "seek_hole": false, 00:19:42.745 "seek_data": false, 00:19:42.745 "copy": true, 00:19:42.745 "nvme_iov_md": false 00:19:42.745 }, 00:19:42.745 "memory_domains": [ 00:19:42.745 { 00:19:42.745 "dma_device_id": "system", 00:19:42.745 "dma_device_type": 1 00:19:42.745 }, 00:19:42.745 { 00:19:42.745 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.745 "dma_device_type": 2 00:19:42.745 } 00:19:42.745 ], 00:19:42.745 "driver_specific": {} 00:19:42.745 } 00:19:42.745 ] 00:19:42.745 15:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:42.745 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:42.745 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:42.745 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:43.011 BaseBdev4 00:19:43.011 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:43.011 15:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # 
local bdev_name=BaseBdev4 00:19:43.011 15:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:43.011 15:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:43.011 15:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:43.011 15:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:43.011 15:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:43.011 15:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:43.270 [ 00:19:43.270 { 00:19:43.270 "name": "BaseBdev4", 00:19:43.270 "aliases": [ 00:19:43.270 "21ffb46e-0a9c-4558-8430-c4f8b7770e89" 00:19:43.270 ], 00:19:43.270 "product_name": "Malloc disk", 00:19:43.270 "block_size": 512, 00:19:43.270 "num_blocks": 65536, 00:19:43.270 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:43.270 "assigned_rate_limits": { 00:19:43.270 "rw_ios_per_sec": 0, 00:19:43.270 "rw_mbytes_per_sec": 0, 00:19:43.270 "r_mbytes_per_sec": 0, 00:19:43.270 "w_mbytes_per_sec": 0 00:19:43.270 }, 00:19:43.270 "claimed": false, 00:19:43.270 "zoned": false, 00:19:43.270 "supported_io_types": { 00:19:43.270 "read": true, 00:19:43.270 "write": true, 00:19:43.270 "unmap": true, 00:19:43.270 "flush": true, 00:19:43.270 "reset": true, 00:19:43.270 "nvme_admin": false, 00:19:43.270 "nvme_io": false, 00:19:43.270 "nvme_io_md": false, 00:19:43.270 "write_zeroes": true, 00:19:43.270 "zcopy": true, 00:19:43.270 "get_zone_info": false, 00:19:43.270 "zone_management": false, 00:19:43.270 "zone_append": false, 00:19:43.270 "compare": false, 00:19:43.270 "compare_and_write": false, 00:19:43.270 "abort": true, 00:19:43.270 "seek_hole": false, 00:19:43.270 "seek_data": false, 00:19:43.270 "copy": true, 00:19:43.270 "nvme_iov_md": false 00:19:43.270 }, 00:19:43.270 "memory_domains": [ 00:19:43.270 { 00:19:43.270 "dma_device_id": "system", 00:19:43.270 "dma_device_type": 1 00:19:43.270 }, 00:19:43.270 { 00:19:43.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.270 "dma_device_type": 2 00:19:43.270 } 00:19:43.270 ], 00:19:43.270 "driver_specific": {} 00:19:43.270 } 00:19:43.270 ] 00:19:43.270 15:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:43.271 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:43.271 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:43.271 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:43.531 [2024-07-12 15:56:03.797022] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:43.531 [2024-07-12 15:56:03.797048] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:43.531 [2024-07-12 15:56:03.797061] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:43.531 [2024-07-12 15:56:03.798086] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:43.531 [2024-07-12 15:56:03.798116] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.531 15:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:43.790 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:43.790 "name": "Existed_Raid", 00:19:43.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.790 "strip_size_kb": 64, 00:19:43.790 "state": "configuring", 00:19:43.790 "raid_level": "concat", 00:19:43.790 "superblock": false, 00:19:43.790 "num_base_bdevs": 4, 00:19:43.790 "num_base_bdevs_discovered": 3, 00:19:43.790 "num_base_bdevs_operational": 4, 00:19:43.790 "base_bdevs_list": [ 00:19:43.790 { 00:19:43.790 "name": "BaseBdev1", 00:19:43.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.790 "is_configured": false, 00:19:43.790 "data_offset": 0, 00:19:43.790 "data_size": 0 00:19:43.790 }, 00:19:43.790 { 00:19:43.790 "name": "BaseBdev2", 00:19:43.790 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:43.790 "is_configured": true, 00:19:43.790 "data_offset": 0, 00:19:43.790 "data_size": 65536 00:19:43.790 }, 00:19:43.790 { 00:19:43.790 "name": "BaseBdev3", 00:19:43.790 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:43.790 "is_configured": true, 00:19:43.790 "data_offset": 0, 00:19:43.790 "data_size": 65536 00:19:43.790 }, 00:19:43.790 { 00:19:43.790 "name": "BaseBdev4", 00:19:43.790 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:43.790 "is_configured": true, 00:19:43.790 "data_offset": 0, 00:19:43.790 "data_size": 65536 00:19:43.790 } 00:19:43.790 ] 00:19:43.790 }' 00:19:43.790 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:43.790 15:56:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 
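# Illustrative sketch only, not part of the captured trace: the bdev_raid.sh@308/@309 step around this
# point removes one base bdev from the superblock-less concat array and then re-checks the array state.
# Assuming the same RPC socket used throughout this run, the check can be reproduced roughly as:
#   rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
#   $rpc -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
#   $rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
#     | jq -r '.[] | select(.name == "Existed_Raid") | .state, .base_bdevs_list[1].is_configured'
# The expected output at this point is "configuring" and "false", which is what the
# verify_raid_bdev_state call and the jq probe in the trace below confirm.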
00:19:44.360 [2024-07-12 15:56:04.731358] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.360 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:44.644 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:44.644 "name": "Existed_Raid", 00:19:44.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.644 "strip_size_kb": 64, 00:19:44.644 "state": "configuring", 00:19:44.644 "raid_level": "concat", 00:19:44.644 "superblock": false, 00:19:44.644 "num_base_bdevs": 4, 00:19:44.644 "num_base_bdevs_discovered": 2, 00:19:44.644 "num_base_bdevs_operational": 4, 00:19:44.644 "base_bdevs_list": [ 00:19:44.644 { 00:19:44.644 "name": "BaseBdev1", 00:19:44.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.644 "is_configured": false, 00:19:44.644 "data_offset": 0, 00:19:44.644 "data_size": 0 00:19:44.644 }, 00:19:44.644 { 00:19:44.644 "name": null, 00:19:44.644 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:44.644 "is_configured": false, 00:19:44.644 "data_offset": 0, 00:19:44.644 "data_size": 65536 00:19:44.644 }, 00:19:44.644 { 00:19:44.644 "name": "BaseBdev3", 00:19:44.644 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:44.644 "is_configured": true, 00:19:44.644 "data_offset": 0, 00:19:44.644 "data_size": 65536 00:19:44.644 }, 00:19:44.644 { 00:19:44.644 "name": "BaseBdev4", 00:19:44.644 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:44.644 "is_configured": true, 00:19:44.644 "data_offset": 0, 00:19:44.644 "data_size": 65536 00:19:44.644 } 00:19:44.644 ] 00:19:44.644 }' 00:19:44.644 15:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:44.644 15:56:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:45.581 15:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.581 15:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:19:45.840 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:45.840 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:45.840 [2024-07-12 15:56:06.224222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:45.840 BaseBdev1 00:19:45.840 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:45.840 15:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:45.840 15:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:45.840 15:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:45.840 15:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:45.840 15:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:45.840 15:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:46.109 15:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:46.368 [ 00:19:46.368 { 00:19:46.368 "name": "BaseBdev1", 00:19:46.368 "aliases": [ 00:19:46.368 "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566" 00:19:46.368 ], 00:19:46.368 "product_name": "Malloc disk", 00:19:46.368 "block_size": 512, 00:19:46.368 "num_blocks": 65536, 00:19:46.368 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:46.368 "assigned_rate_limits": { 00:19:46.368 "rw_ios_per_sec": 0, 00:19:46.368 "rw_mbytes_per_sec": 0, 00:19:46.368 "r_mbytes_per_sec": 0, 00:19:46.368 "w_mbytes_per_sec": 0 00:19:46.368 }, 00:19:46.368 "claimed": true, 00:19:46.368 "claim_type": "exclusive_write", 00:19:46.368 "zoned": false, 00:19:46.368 "supported_io_types": { 00:19:46.368 "read": true, 00:19:46.368 "write": true, 00:19:46.368 "unmap": true, 00:19:46.368 "flush": true, 00:19:46.368 "reset": true, 00:19:46.368 "nvme_admin": false, 00:19:46.368 "nvme_io": false, 00:19:46.368 "nvme_io_md": false, 00:19:46.368 "write_zeroes": true, 00:19:46.368 "zcopy": true, 00:19:46.368 "get_zone_info": false, 00:19:46.368 "zone_management": false, 00:19:46.368 "zone_append": false, 00:19:46.368 "compare": false, 00:19:46.368 "compare_and_write": false, 00:19:46.368 "abort": true, 00:19:46.368 "seek_hole": false, 00:19:46.368 "seek_data": false, 00:19:46.368 "copy": true, 00:19:46.368 "nvme_iov_md": false 00:19:46.368 }, 00:19:46.368 "memory_domains": [ 00:19:46.368 { 00:19:46.368 "dma_device_id": "system", 00:19:46.368 "dma_device_type": 1 00:19:46.368 }, 00:19:46.368 { 00:19:46.368 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.368 "dma_device_type": 2 00:19:46.368 } 00:19:46.368 ], 00:19:46.368 "driver_specific": {} 00:19:46.368 } 00:19:46.368 ] 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:46.368 15:56:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.368 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:46.628 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.628 "name": "Existed_Raid", 00:19:46.628 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.628 "strip_size_kb": 64, 00:19:46.628 "state": "configuring", 00:19:46.628 "raid_level": "concat", 00:19:46.628 "superblock": false, 00:19:46.628 "num_base_bdevs": 4, 00:19:46.628 "num_base_bdevs_discovered": 3, 00:19:46.628 "num_base_bdevs_operational": 4, 00:19:46.628 "base_bdevs_list": [ 00:19:46.628 { 00:19:46.628 "name": "BaseBdev1", 00:19:46.628 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:46.628 "is_configured": true, 00:19:46.628 "data_offset": 0, 00:19:46.628 "data_size": 65536 00:19:46.628 }, 00:19:46.628 { 00:19:46.628 "name": null, 00:19:46.628 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:46.628 "is_configured": false, 00:19:46.628 "data_offset": 0, 00:19:46.628 "data_size": 65536 00:19:46.628 }, 00:19:46.628 { 00:19:46.628 "name": "BaseBdev3", 00:19:46.628 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:46.628 "is_configured": true, 00:19:46.628 "data_offset": 0, 00:19:46.628 "data_size": 65536 00:19:46.628 }, 00:19:46.628 { 00:19:46.628 "name": "BaseBdev4", 00:19:46.628 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:46.628 "is_configured": true, 00:19:46.628 "data_offset": 0, 00:19:46.628 "data_size": 65536 00:19:46.628 } 00:19:46.628 ] 00:19:46.628 }' 00:19:46.628 15:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.628 15:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:47.232 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.232 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:47.232 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:47.232 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:47.491 [2024-07-12 15:56:07.772178] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:47.491 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:47.491 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:47.491 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:47.491 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:47.491 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:47.491 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:47.492 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.492 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.492 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.492 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.492 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.492 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:47.751 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.751 "name": "Existed_Raid", 00:19:47.751 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.751 "strip_size_kb": 64, 00:19:47.751 "state": "configuring", 00:19:47.751 "raid_level": "concat", 00:19:47.751 "superblock": false, 00:19:47.751 "num_base_bdevs": 4, 00:19:47.751 "num_base_bdevs_discovered": 2, 00:19:47.751 "num_base_bdevs_operational": 4, 00:19:47.751 "base_bdevs_list": [ 00:19:47.751 { 00:19:47.751 "name": "BaseBdev1", 00:19:47.751 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:47.751 "is_configured": true, 00:19:47.751 "data_offset": 0, 00:19:47.751 "data_size": 65536 00:19:47.751 }, 00:19:47.751 { 00:19:47.751 "name": null, 00:19:47.751 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:47.751 "is_configured": false, 00:19:47.751 "data_offset": 0, 00:19:47.751 "data_size": 65536 00:19:47.751 }, 00:19:47.751 { 00:19:47.751 "name": null, 00:19:47.751 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:47.751 "is_configured": false, 00:19:47.751 "data_offset": 0, 00:19:47.751 "data_size": 65536 00:19:47.751 }, 00:19:47.751 { 00:19:47.751 "name": "BaseBdev4", 00:19:47.751 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:47.751 "is_configured": true, 00:19:47.751 "data_offset": 0, 00:19:47.751 "data_size": 65536 00:19:47.751 } 00:19:47.751 ] 00:19:47.751 }' 00:19:47.751 15:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.751 15:56:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:48.322 15:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.322 15:56:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:48.892 [2024-07-12 15:56:09.267997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.892 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:49.152 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.152 "name": "Existed_Raid", 00:19:49.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.152 "strip_size_kb": 64, 00:19:49.152 "state": "configuring", 00:19:49.152 "raid_level": "concat", 00:19:49.152 "superblock": false, 00:19:49.152 "num_base_bdevs": 4, 00:19:49.152 "num_base_bdevs_discovered": 3, 00:19:49.152 "num_base_bdevs_operational": 4, 00:19:49.152 "base_bdevs_list": [ 00:19:49.152 { 00:19:49.152 "name": "BaseBdev1", 00:19:49.152 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:49.152 "is_configured": true, 00:19:49.152 "data_offset": 0, 00:19:49.152 "data_size": 65536 00:19:49.152 }, 00:19:49.152 { 00:19:49.152 "name": null, 00:19:49.152 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:49.152 "is_configured": false, 00:19:49.152 "data_offset": 0, 00:19:49.152 "data_size": 65536 00:19:49.152 }, 00:19:49.152 { 00:19:49.152 "name": "BaseBdev3", 00:19:49.152 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:49.152 "is_configured": true, 00:19:49.152 "data_offset": 0, 00:19:49.152 "data_size": 65536 00:19:49.152 }, 00:19:49.152 { 00:19:49.152 "name": "BaseBdev4", 00:19:49.152 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:49.152 "is_configured": true, 00:19:49.152 "data_offset": 0, 00:19:49.152 "data_size": 65536 00:19:49.152 } 00:19:49.152 ] 00:19:49.152 }' 00:19:49.152 15:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:19:49.152 15:56:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.092 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.092 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:50.092 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:50.092 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:50.352 [2024-07-12 15:56:10.563272] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.352 "name": "Existed_Raid", 00:19:50.352 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.352 "strip_size_kb": 64, 00:19:50.352 "state": "configuring", 00:19:50.352 "raid_level": "concat", 00:19:50.352 "superblock": false, 00:19:50.352 "num_base_bdevs": 4, 00:19:50.352 "num_base_bdevs_discovered": 2, 00:19:50.352 "num_base_bdevs_operational": 4, 00:19:50.352 "base_bdevs_list": [ 00:19:50.352 { 00:19:50.352 "name": null, 00:19:50.352 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:50.352 "is_configured": false, 00:19:50.352 "data_offset": 0, 00:19:50.352 "data_size": 65536 00:19:50.352 }, 00:19:50.352 { 00:19:50.352 "name": null, 00:19:50.352 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:50.352 "is_configured": false, 00:19:50.352 "data_offset": 0, 00:19:50.352 "data_size": 65536 00:19:50.352 }, 00:19:50.352 { 00:19:50.352 "name": "BaseBdev3", 00:19:50.352 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:50.352 "is_configured": true, 00:19:50.352 "data_offset": 0, 00:19:50.352 "data_size": 65536 00:19:50.352 }, 00:19:50.352 { 
00:19:50.352 "name": "BaseBdev4", 00:19:50.352 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:50.352 "is_configured": true, 00:19:50.352 "data_offset": 0, 00:19:50.352 "data_size": 65536 00:19:50.352 } 00:19:50.352 ] 00:19:50.352 }' 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.352 15:56:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:50.921 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.921 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:51.181 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:51.181 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:51.441 [2024-07-12 15:56:11.700024] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.441 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:51.701 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.701 "name": "Existed_Raid", 00:19:51.701 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.701 "strip_size_kb": 64, 00:19:51.701 "state": "configuring", 00:19:51.701 "raid_level": "concat", 00:19:51.701 "superblock": false, 00:19:51.701 "num_base_bdevs": 4, 00:19:51.701 "num_base_bdevs_discovered": 3, 00:19:51.701 "num_base_bdevs_operational": 4, 00:19:51.701 "base_bdevs_list": [ 00:19:51.701 { 00:19:51.701 "name": null, 00:19:51.701 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:51.701 "is_configured": false, 00:19:51.701 "data_offset": 0, 00:19:51.701 "data_size": 65536 00:19:51.701 }, 00:19:51.701 { 00:19:51.701 "name": "BaseBdev2", 00:19:51.701 "uuid": 
"a8728151-d937-4798-be30-67c248983879", 00:19:51.701 "is_configured": true, 00:19:51.701 "data_offset": 0, 00:19:51.701 "data_size": 65536 00:19:51.701 }, 00:19:51.701 { 00:19:51.701 "name": "BaseBdev3", 00:19:51.701 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:51.701 "is_configured": true, 00:19:51.701 "data_offset": 0, 00:19:51.701 "data_size": 65536 00:19:51.701 }, 00:19:51.701 { 00:19:51.701 "name": "BaseBdev4", 00:19:51.701 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:51.701 "is_configured": true, 00:19:51.701 "data_offset": 0, 00:19:51.701 "data_size": 65536 00:19:51.701 } 00:19:51.701 ] 00:19:51.701 }' 00:19:51.701 15:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.702 15:56:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.272 15:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.272 15:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:52.272 15:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:52.272 15:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.272 15:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:52.532 15:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c11aaa32-87fd-4d9d-b0b2-a4c2e3400566 00:19:52.791 [2024-07-12 15:56:13.036390] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:52.791 [2024-07-12 15:56:13.036417] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14ce8c0 00:19:52.791 [2024-07-12 15:56:13.036421] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:52.791 [2024-07-12 15:56:13.036579] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14d34c0 00:19:52.791 [2024-07-12 15:56:13.036670] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14ce8c0 00:19:52.791 [2024-07-12 15:56:13.036676] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x14ce8c0 00:19:52.791 [2024-07-12 15:56:13.036798] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:52.791 NewBaseBdev 00:19:52.791 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:52.791 15:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:52.791 15:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:52.791 15:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:52.791 15:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:52.791 15:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:52.791 15:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:52.791 15:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:53.051 [ 00:19:53.051 { 00:19:53.051 "name": "NewBaseBdev", 00:19:53.051 "aliases": [ 00:19:53.051 "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566" 00:19:53.051 ], 00:19:53.051 "product_name": "Malloc disk", 00:19:53.051 "block_size": 512, 00:19:53.051 "num_blocks": 65536, 00:19:53.051 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:53.051 "assigned_rate_limits": { 00:19:53.051 "rw_ios_per_sec": 0, 00:19:53.051 "rw_mbytes_per_sec": 0, 00:19:53.051 "r_mbytes_per_sec": 0, 00:19:53.051 "w_mbytes_per_sec": 0 00:19:53.051 }, 00:19:53.051 "claimed": true, 00:19:53.051 "claim_type": "exclusive_write", 00:19:53.051 "zoned": false, 00:19:53.051 "supported_io_types": { 00:19:53.051 "read": true, 00:19:53.051 "write": true, 00:19:53.051 "unmap": true, 00:19:53.051 "flush": true, 00:19:53.051 "reset": true, 00:19:53.051 "nvme_admin": false, 00:19:53.051 "nvme_io": false, 00:19:53.051 "nvme_io_md": false, 00:19:53.051 "write_zeroes": true, 00:19:53.051 "zcopy": true, 00:19:53.051 "get_zone_info": false, 00:19:53.051 "zone_management": false, 00:19:53.051 "zone_append": false, 00:19:53.051 "compare": false, 00:19:53.051 "compare_and_write": false, 00:19:53.051 "abort": true, 00:19:53.051 "seek_hole": false, 00:19:53.051 "seek_data": false, 00:19:53.051 "copy": true, 00:19:53.051 "nvme_iov_md": false 00:19:53.051 }, 00:19:53.051 "memory_domains": [ 00:19:53.051 { 00:19:53.051 "dma_device_id": "system", 00:19:53.051 "dma_device_type": 1 00:19:53.051 }, 00:19:53.051 { 00:19:53.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.051 "dma_device_type": 2 00:19:53.051 } 00:19:53.051 ], 00:19:53.051 "driver_specific": {} 00:19:53.051 } 00:19:53.051 ] 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.051 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:19:53.311 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.311 "name": "Existed_Raid", 00:19:53.311 "uuid": "d2f78736-f904-466d-98d7-3478aa6d53e8", 00:19:53.311 "strip_size_kb": 64, 00:19:53.311 "state": "online", 00:19:53.311 "raid_level": "concat", 00:19:53.311 "superblock": false, 00:19:53.311 "num_base_bdevs": 4, 00:19:53.311 "num_base_bdevs_discovered": 4, 00:19:53.311 "num_base_bdevs_operational": 4, 00:19:53.311 "base_bdevs_list": [ 00:19:53.311 { 00:19:53.311 "name": "NewBaseBdev", 00:19:53.311 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:53.311 "is_configured": true, 00:19:53.311 "data_offset": 0, 00:19:53.311 "data_size": 65536 00:19:53.311 }, 00:19:53.311 { 00:19:53.311 "name": "BaseBdev2", 00:19:53.311 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:53.311 "is_configured": true, 00:19:53.311 "data_offset": 0, 00:19:53.311 "data_size": 65536 00:19:53.311 }, 00:19:53.311 { 00:19:53.311 "name": "BaseBdev3", 00:19:53.311 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:53.311 "is_configured": true, 00:19:53.311 "data_offset": 0, 00:19:53.311 "data_size": 65536 00:19:53.311 }, 00:19:53.311 { 00:19:53.311 "name": "BaseBdev4", 00:19:53.311 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:53.311 "is_configured": true, 00:19:53.311 "data_offset": 0, 00:19:53.311 "data_size": 65536 00:19:53.311 } 00:19:53.311 ] 00:19:53.311 }' 00:19:53.311 15:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.312 15:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:53.881 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:53.881 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:53.881 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:53.881 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:53.881 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:53.881 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:53.881 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:53.881 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:54.142 [2024-07-12 15:56:14.343998] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:54.142 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:54.142 "name": "Existed_Raid", 00:19:54.142 "aliases": [ 00:19:54.142 "d2f78736-f904-466d-98d7-3478aa6d53e8" 00:19:54.142 ], 00:19:54.142 "product_name": "Raid Volume", 00:19:54.142 "block_size": 512, 00:19:54.142 "num_blocks": 262144, 00:19:54.142 "uuid": "d2f78736-f904-466d-98d7-3478aa6d53e8", 00:19:54.142 "assigned_rate_limits": { 00:19:54.142 "rw_ios_per_sec": 0, 00:19:54.142 "rw_mbytes_per_sec": 0, 00:19:54.142 "r_mbytes_per_sec": 0, 00:19:54.142 "w_mbytes_per_sec": 0 00:19:54.142 }, 00:19:54.142 "claimed": false, 00:19:54.142 "zoned": false, 00:19:54.142 "supported_io_types": { 00:19:54.142 "read": true, 00:19:54.142 "write": true, 00:19:54.142 "unmap": true, 
00:19:54.142 "flush": true, 00:19:54.142 "reset": true, 00:19:54.142 "nvme_admin": false, 00:19:54.142 "nvme_io": false, 00:19:54.142 "nvme_io_md": false, 00:19:54.142 "write_zeroes": true, 00:19:54.142 "zcopy": false, 00:19:54.142 "get_zone_info": false, 00:19:54.142 "zone_management": false, 00:19:54.142 "zone_append": false, 00:19:54.142 "compare": false, 00:19:54.142 "compare_and_write": false, 00:19:54.142 "abort": false, 00:19:54.142 "seek_hole": false, 00:19:54.142 "seek_data": false, 00:19:54.142 "copy": false, 00:19:54.142 "nvme_iov_md": false 00:19:54.142 }, 00:19:54.142 "memory_domains": [ 00:19:54.142 { 00:19:54.142 "dma_device_id": "system", 00:19:54.142 "dma_device_type": 1 00:19:54.142 }, 00:19:54.142 { 00:19:54.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.142 "dma_device_type": 2 00:19:54.142 }, 00:19:54.142 { 00:19:54.142 "dma_device_id": "system", 00:19:54.142 "dma_device_type": 1 00:19:54.142 }, 00:19:54.142 { 00:19:54.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.142 "dma_device_type": 2 00:19:54.142 }, 00:19:54.142 { 00:19:54.142 "dma_device_id": "system", 00:19:54.142 "dma_device_type": 1 00:19:54.142 }, 00:19:54.142 { 00:19:54.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.142 "dma_device_type": 2 00:19:54.142 }, 00:19:54.142 { 00:19:54.142 "dma_device_id": "system", 00:19:54.142 "dma_device_type": 1 00:19:54.142 }, 00:19:54.142 { 00:19:54.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.142 "dma_device_type": 2 00:19:54.142 } 00:19:54.142 ], 00:19:54.142 "driver_specific": { 00:19:54.142 "raid": { 00:19:54.142 "uuid": "d2f78736-f904-466d-98d7-3478aa6d53e8", 00:19:54.142 "strip_size_kb": 64, 00:19:54.142 "state": "online", 00:19:54.142 "raid_level": "concat", 00:19:54.142 "superblock": false, 00:19:54.142 "num_base_bdevs": 4, 00:19:54.142 "num_base_bdevs_discovered": 4, 00:19:54.142 "num_base_bdevs_operational": 4, 00:19:54.142 "base_bdevs_list": [ 00:19:54.142 { 00:19:54.142 "name": "NewBaseBdev", 00:19:54.142 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:54.142 "is_configured": true, 00:19:54.142 "data_offset": 0, 00:19:54.142 "data_size": 65536 00:19:54.142 }, 00:19:54.142 { 00:19:54.143 "name": "BaseBdev2", 00:19:54.143 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:54.143 "is_configured": true, 00:19:54.143 "data_offset": 0, 00:19:54.143 "data_size": 65536 00:19:54.143 }, 00:19:54.143 { 00:19:54.143 "name": "BaseBdev3", 00:19:54.143 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:54.143 "is_configured": true, 00:19:54.143 "data_offset": 0, 00:19:54.143 "data_size": 65536 00:19:54.143 }, 00:19:54.143 { 00:19:54.143 "name": "BaseBdev4", 00:19:54.143 "uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:54.143 "is_configured": true, 00:19:54.143 "data_offset": 0, 00:19:54.143 "data_size": 65536 00:19:54.143 } 00:19:54.143 ] 00:19:54.143 } 00:19:54.143 } 00:19:54.143 }' 00:19:54.143 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:54.143 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:54.143 BaseBdev2 00:19:54.143 BaseBdev3 00:19:54.143 BaseBdev4' 00:19:54.143 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:54.143 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b NewBaseBdev 00:19:54.143 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:54.403 "name": "NewBaseBdev", 00:19:54.403 "aliases": [ 00:19:54.403 "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566" 00:19:54.403 ], 00:19:54.403 "product_name": "Malloc disk", 00:19:54.403 "block_size": 512, 00:19:54.403 "num_blocks": 65536, 00:19:54.403 "uuid": "c11aaa32-87fd-4d9d-b0b2-a4c2e3400566", 00:19:54.403 "assigned_rate_limits": { 00:19:54.403 "rw_ios_per_sec": 0, 00:19:54.403 "rw_mbytes_per_sec": 0, 00:19:54.403 "r_mbytes_per_sec": 0, 00:19:54.403 "w_mbytes_per_sec": 0 00:19:54.403 }, 00:19:54.403 "claimed": true, 00:19:54.403 "claim_type": "exclusive_write", 00:19:54.403 "zoned": false, 00:19:54.403 "supported_io_types": { 00:19:54.403 "read": true, 00:19:54.403 "write": true, 00:19:54.403 "unmap": true, 00:19:54.403 "flush": true, 00:19:54.403 "reset": true, 00:19:54.403 "nvme_admin": false, 00:19:54.403 "nvme_io": false, 00:19:54.403 "nvme_io_md": false, 00:19:54.403 "write_zeroes": true, 00:19:54.403 "zcopy": true, 00:19:54.403 "get_zone_info": false, 00:19:54.403 "zone_management": false, 00:19:54.403 "zone_append": false, 00:19:54.403 "compare": false, 00:19:54.403 "compare_and_write": false, 00:19:54.403 "abort": true, 00:19:54.403 "seek_hole": false, 00:19:54.403 "seek_data": false, 00:19:54.403 "copy": true, 00:19:54.403 "nvme_iov_md": false 00:19:54.403 }, 00:19:54.403 "memory_domains": [ 00:19:54.403 { 00:19:54.403 "dma_device_id": "system", 00:19:54.403 "dma_device_type": 1 00:19:54.403 }, 00:19:54.403 { 00:19:54.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.403 "dma_device_type": 2 00:19:54.403 } 00:19:54.403 ], 00:19:54.403 "driver_specific": {} 00:19:54.403 }' 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:54.403 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.663 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:54.663 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:54.663 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:54.663 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:54.663 15:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:54.923 15:56:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:54.923 "name": "BaseBdev2", 00:19:54.923 "aliases": [ 00:19:54.923 "a8728151-d937-4798-be30-67c248983879" 00:19:54.923 ], 00:19:54.923 "product_name": "Malloc disk", 00:19:54.923 "block_size": 512, 00:19:54.923 "num_blocks": 65536, 00:19:54.923 "uuid": "a8728151-d937-4798-be30-67c248983879", 00:19:54.923 "assigned_rate_limits": { 00:19:54.923 "rw_ios_per_sec": 0, 00:19:54.923 "rw_mbytes_per_sec": 0, 00:19:54.923 "r_mbytes_per_sec": 0, 00:19:54.923 "w_mbytes_per_sec": 0 00:19:54.923 }, 00:19:54.923 "claimed": true, 00:19:54.923 "claim_type": "exclusive_write", 00:19:54.923 "zoned": false, 00:19:54.923 "supported_io_types": { 00:19:54.923 "read": true, 00:19:54.923 "write": true, 00:19:54.923 "unmap": true, 00:19:54.923 "flush": true, 00:19:54.923 "reset": true, 00:19:54.923 "nvme_admin": false, 00:19:54.923 "nvme_io": false, 00:19:54.923 "nvme_io_md": false, 00:19:54.923 "write_zeroes": true, 00:19:54.923 "zcopy": true, 00:19:54.923 "get_zone_info": false, 00:19:54.923 "zone_management": false, 00:19:54.923 "zone_append": false, 00:19:54.923 "compare": false, 00:19:54.923 "compare_and_write": false, 00:19:54.923 "abort": true, 00:19:54.923 "seek_hole": false, 00:19:54.923 "seek_data": false, 00:19:54.923 "copy": true, 00:19:54.923 "nvme_iov_md": false 00:19:54.923 }, 00:19:54.923 "memory_domains": [ 00:19:54.923 { 00:19:54.923 "dma_device_id": "system", 00:19:54.923 "dma_device_type": 1 00:19:54.923 }, 00:19:54.923 { 00:19:54.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.923 "dma_device_type": 2 00:19:54.923 } 00:19:54.923 ], 00:19:54.923 "driver_specific": {} 00:19:54.923 }' 00:19:54.924 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.924 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:54.924 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:54.924 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.924 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:54.924 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:54.924 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:54.924 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.184 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:55.184 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.184 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.184 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.184 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.184 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:55.184 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.443 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.443 "name": "BaseBdev3", 00:19:55.443 "aliases": [ 00:19:55.443 
"736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3" 00:19:55.444 ], 00:19:55.444 "product_name": "Malloc disk", 00:19:55.444 "block_size": 512, 00:19:55.444 "num_blocks": 65536, 00:19:55.444 "uuid": "736a15b2-3ae8-4ee0-9b6c-4d67d4bfffe3", 00:19:55.444 "assigned_rate_limits": { 00:19:55.444 "rw_ios_per_sec": 0, 00:19:55.444 "rw_mbytes_per_sec": 0, 00:19:55.444 "r_mbytes_per_sec": 0, 00:19:55.444 "w_mbytes_per_sec": 0 00:19:55.444 }, 00:19:55.444 "claimed": true, 00:19:55.444 "claim_type": "exclusive_write", 00:19:55.444 "zoned": false, 00:19:55.444 "supported_io_types": { 00:19:55.444 "read": true, 00:19:55.444 "write": true, 00:19:55.444 "unmap": true, 00:19:55.444 "flush": true, 00:19:55.444 "reset": true, 00:19:55.444 "nvme_admin": false, 00:19:55.444 "nvme_io": false, 00:19:55.444 "nvme_io_md": false, 00:19:55.444 "write_zeroes": true, 00:19:55.444 "zcopy": true, 00:19:55.444 "get_zone_info": false, 00:19:55.444 "zone_management": false, 00:19:55.444 "zone_append": false, 00:19:55.444 "compare": false, 00:19:55.444 "compare_and_write": false, 00:19:55.444 "abort": true, 00:19:55.444 "seek_hole": false, 00:19:55.444 "seek_data": false, 00:19:55.444 "copy": true, 00:19:55.444 "nvme_iov_md": false 00:19:55.444 }, 00:19:55.444 "memory_domains": [ 00:19:55.444 { 00:19:55.444 "dma_device_id": "system", 00:19:55.444 "dma_device_type": 1 00:19:55.444 }, 00:19:55.444 { 00:19:55.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.444 "dma_device_type": 2 00:19:55.444 } 00:19:55.444 ], 00:19:55.444 "driver_specific": {} 00:19:55.444 }' 00:19:55.444 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.444 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.444 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:55.444 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.444 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.444 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:55.444 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.444 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:55.704 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:55.704 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.704 15:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:55.704 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:55.704 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:55.704 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:55.704 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:55.964 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:55.964 "name": "BaseBdev4", 00:19:55.964 "aliases": [ 00:19:55.964 "21ffb46e-0a9c-4558-8430-c4f8b7770e89" 00:19:55.964 ], 00:19:55.964 "product_name": "Malloc disk", 00:19:55.964 "block_size": 512, 00:19:55.964 "num_blocks": 65536, 00:19:55.964 
"uuid": "21ffb46e-0a9c-4558-8430-c4f8b7770e89", 00:19:55.964 "assigned_rate_limits": { 00:19:55.964 "rw_ios_per_sec": 0, 00:19:55.964 "rw_mbytes_per_sec": 0, 00:19:55.964 "r_mbytes_per_sec": 0, 00:19:55.964 "w_mbytes_per_sec": 0 00:19:55.964 }, 00:19:55.964 "claimed": true, 00:19:55.964 "claim_type": "exclusive_write", 00:19:55.964 "zoned": false, 00:19:55.964 "supported_io_types": { 00:19:55.964 "read": true, 00:19:55.964 "write": true, 00:19:55.964 "unmap": true, 00:19:55.964 "flush": true, 00:19:55.964 "reset": true, 00:19:55.964 "nvme_admin": false, 00:19:55.964 "nvme_io": false, 00:19:55.964 "nvme_io_md": false, 00:19:55.964 "write_zeroes": true, 00:19:55.964 "zcopy": true, 00:19:55.964 "get_zone_info": false, 00:19:55.964 "zone_management": false, 00:19:55.964 "zone_append": false, 00:19:55.964 "compare": false, 00:19:55.964 "compare_and_write": false, 00:19:55.964 "abort": true, 00:19:55.964 "seek_hole": false, 00:19:55.964 "seek_data": false, 00:19:55.964 "copy": true, 00:19:55.964 "nvme_iov_md": false 00:19:55.964 }, 00:19:55.964 "memory_domains": [ 00:19:55.964 { 00:19:55.964 "dma_device_id": "system", 00:19:55.964 "dma_device_type": 1 00:19:55.964 }, 00:19:55.964 { 00:19:55.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.964 "dma_device_type": 2 00:19:55.964 } 00:19:55.964 ], 00:19:55.964 "driver_specific": {} 00:19:55.964 }' 00:19:55.964 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.964 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:55.964 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:55.964 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.964 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:55.964 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:55.964 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.223 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:56.223 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:56.223 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.223 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:56.223 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:56.223 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:56.483 [2024-07-12 15:56:16.705713] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:56.483 [2024-07-12 15:56:16.705738] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:56.483 [2024-07-12 15:56:16.705780] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:56.483 [2024-07-12 15:56:16.705826] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:56.483 [2024-07-12 15:56:16.705833] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14ce8c0 name Existed_Raid, state offline 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@341 -- # killprocess 2584957 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 2584957 ']' 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2584957 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2584957 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2584957' 00:19:56.483 killing process with pid 2584957 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2584957 00:19:56.483 [2024-07-12 15:56:16.774402] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2584957 00:19:56.483 [2024-07-12 15:56:16.794846] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:56.483 00:19:56.483 real 0m28.055s 00:19:56.483 user 0m52.711s 00:19:56.483 sys 0m4.086s 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:56.483 15:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:56.483 ************************************ 00:19:56.483 END TEST raid_state_function_test 00:19:56.483 ************************************ 00:19:56.744 15:56:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:56.744 15:56:16 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:19:56.744 15:56:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:56.744 15:56:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:56.744 15:56:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:56.744 ************************************ 00:19:56.744 START TEST raid_state_function_test_sb 00:19:56.744 ************************************ 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:56.744 
15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2590229 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2590229' 00:19:56.744 Process raid pid: 2590229 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2590229 /var/tmp/spdk-raid.sock 00:19:56.744 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:56.744 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2590229 ']' 00:19:56.744 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:56.744 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:56.744 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 
'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:56.744 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:56.744 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:56.744 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:56.744 [2024-07-12 15:56:17.052392] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:19:56.744 [2024-07-12 15:56:17.052437] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:56.745 [2024-07-12 15:56:17.142009] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:57.004 [2024-07-12 15:56:17.204941] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:57.004 [2024-07-12 15:56:17.243643] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:57.004 [2024-07-12 15:56:17.243664] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:57.574 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:57.574 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:57.574 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:57.834 [2024-07-12 15:56:18.058600] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:57.834 [2024-07-12 15:56:18.058627] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:57.834 [2024-07-12 15:56:18.058633] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:57.834 [2024-07-12 15:56:18.058639] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:57.834 [2024-07-12 15:56:18.058643] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:57.834 [2024-07-12 15:56:18.058649] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:57.834 [2024-07-12 15:56:18.058653] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:57.834 [2024-07-12 15:56:18.058658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
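For reference, the RPC flow that raid_state_function_test_sb exercises above can be reproduced by hand. The sketch below is not the test script itself; it only reuses the bdev_svc invocation, RPC socket, and bdev_raid_create / bdev_raid_get_bdevs arguments visible in this trace, and it assumes an SPDK checkout at the workspace path used in this run.

# Minimal sketch, assuming an SPDK checkout at $SPDK and the RPC socket path used in this run.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
SOCK=/var/tmp/spdk-raid.sock
RPC="$SPDK/scripts/rpc.py -s $SOCK"

# Start the bare bdev service with raid debug logging, as the harness does.
"$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
raid_pid=$!
while [ ! -S "$SOCK" ]; do sleep 0.2; done   # crude stand-in for the harness's waitforlisten

# Create a 4-member concat raid with a 64 KiB strip size and a superblock (-s)
# before any base bdev exists; the raid is registered but stays in the "configuring" state.
$RPC bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# Inspect the raid the same way verify_raid_bdev_state does.
$RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'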
00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:57.834 "name": "Existed_Raid", 00:19:57.834 "uuid": "8adbad4b-9afa-4774-93cd-801c6ea726ab", 00:19:57.834 "strip_size_kb": 64, 00:19:57.834 "state": "configuring", 00:19:57.834 "raid_level": "concat", 00:19:57.834 "superblock": true, 00:19:57.834 "num_base_bdevs": 4, 00:19:57.834 "num_base_bdevs_discovered": 0, 00:19:57.834 "num_base_bdevs_operational": 4, 00:19:57.834 "base_bdevs_list": [ 00:19:57.834 { 00:19:57.834 "name": "BaseBdev1", 00:19:57.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:57.834 "is_configured": false, 00:19:57.834 "data_offset": 0, 00:19:57.834 "data_size": 0 00:19:57.834 }, 00:19:57.834 { 00:19:57.834 "name": "BaseBdev2", 00:19:57.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:57.834 "is_configured": false, 00:19:57.834 "data_offset": 0, 00:19:57.834 "data_size": 0 00:19:57.834 }, 00:19:57.834 { 00:19:57.834 "name": "BaseBdev3", 00:19:57.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:57.834 "is_configured": false, 00:19:57.834 "data_offset": 0, 00:19:57.834 "data_size": 0 00:19:57.834 }, 00:19:57.834 { 00:19:57.834 "name": "BaseBdev4", 00:19:57.834 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:57.834 "is_configured": false, 00:19:57.834 "data_offset": 0, 00:19:57.834 "data_size": 0 00:19:57.834 } 00:19:57.834 ] 00:19:57.834 }' 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:57.834 15:56:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:58.402 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:58.662 [2024-07-12 15:56:18.984828] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:58.662 [2024-07-12 15:56:18.984845] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1131920 name Existed_Raid, state configuring 00:19:58.662 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:58.922 [2024-07-12 15:56:19.177342] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:58.922 [2024-07-12 15:56:19.177362] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:58.922 [2024-07-12 15:56:19.177367] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:19:58.922 [2024-07-12 15:56:19.177373] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:58.922 [2024-07-12 15:56:19.177377] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:58.922 [2024-07-12 15:56:19.177383] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:58.922 [2024-07-12 15:56:19.177387] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:58.922 [2024-07-12 15:56:19.177393] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:58.922 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:58.922 [2024-07-12 15:56:19.368341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:58.922 BaseBdev1 00:19:59.182 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:59.182 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:59.182 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:59.182 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:59.182 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:59.182 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:59.182 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:59.182 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:59.442 [ 00:19:59.442 { 00:19:59.442 "name": "BaseBdev1", 00:19:59.442 "aliases": [ 00:19:59.442 "aed6f78c-32b2-4a9e-ba19-92c55cb8cef3" 00:19:59.442 ], 00:19:59.442 "product_name": "Malloc disk", 00:19:59.442 "block_size": 512, 00:19:59.442 "num_blocks": 65536, 00:19:59.442 "uuid": "aed6f78c-32b2-4a9e-ba19-92c55cb8cef3", 00:19:59.442 "assigned_rate_limits": { 00:19:59.442 "rw_ios_per_sec": 0, 00:19:59.442 "rw_mbytes_per_sec": 0, 00:19:59.442 "r_mbytes_per_sec": 0, 00:19:59.442 "w_mbytes_per_sec": 0 00:19:59.442 }, 00:19:59.442 "claimed": true, 00:19:59.442 "claim_type": "exclusive_write", 00:19:59.442 "zoned": false, 00:19:59.442 "supported_io_types": { 00:19:59.442 "read": true, 00:19:59.442 "write": true, 00:19:59.442 "unmap": true, 00:19:59.442 "flush": true, 00:19:59.442 "reset": true, 00:19:59.442 "nvme_admin": false, 00:19:59.442 "nvme_io": false, 00:19:59.442 "nvme_io_md": false, 00:19:59.442 "write_zeroes": true, 00:19:59.442 "zcopy": true, 00:19:59.442 "get_zone_info": false, 00:19:59.442 "zone_management": false, 00:19:59.442 "zone_append": false, 00:19:59.442 "compare": false, 00:19:59.442 "compare_and_write": false, 00:19:59.442 "abort": true, 00:19:59.443 "seek_hole": false, 00:19:59.443 "seek_data": false, 00:19:59.443 "copy": true, 00:19:59.443 "nvme_iov_md": false 00:19:59.443 }, 00:19:59.443 "memory_domains": [ 00:19:59.443 { 00:19:59.443 
"dma_device_id": "system", 00:19:59.443 "dma_device_type": 1 00:19:59.443 }, 00:19:59.443 { 00:19:59.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:59.443 "dma_device_type": 2 00:19:59.443 } 00:19:59.443 ], 00:19:59.443 "driver_specific": {} 00:19:59.443 } 00:19:59.443 ] 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.443 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:59.703 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.703 "name": "Existed_Raid", 00:19:59.703 "uuid": "f3babfbb-bfef-4249-8cc5-31365ad4475f", 00:19:59.703 "strip_size_kb": 64, 00:19:59.703 "state": "configuring", 00:19:59.703 "raid_level": "concat", 00:19:59.703 "superblock": true, 00:19:59.703 "num_base_bdevs": 4, 00:19:59.703 "num_base_bdevs_discovered": 1, 00:19:59.703 "num_base_bdevs_operational": 4, 00:19:59.703 "base_bdevs_list": [ 00:19:59.703 { 00:19:59.703 "name": "BaseBdev1", 00:19:59.703 "uuid": "aed6f78c-32b2-4a9e-ba19-92c55cb8cef3", 00:19:59.703 "is_configured": true, 00:19:59.703 "data_offset": 2048, 00:19:59.703 "data_size": 63488 00:19:59.703 }, 00:19:59.703 { 00:19:59.703 "name": "BaseBdev2", 00:19:59.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.703 "is_configured": false, 00:19:59.703 "data_offset": 0, 00:19:59.703 "data_size": 0 00:19:59.703 }, 00:19:59.703 { 00:19:59.703 "name": "BaseBdev3", 00:19:59.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.703 "is_configured": false, 00:19:59.703 "data_offset": 0, 00:19:59.703 "data_size": 0 00:19:59.703 }, 00:19:59.703 { 00:19:59.703 "name": "BaseBdev4", 00:19:59.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.703 "is_configured": false, 00:19:59.703 "data_offset": 0, 00:19:59.703 "data_size": 0 00:19:59.703 } 00:19:59.703 ] 00:19:59.703 }' 00:19:59.703 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.703 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 
00:20:00.355 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:00.355 [2024-07-12 15:56:20.675638] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:00.355 [2024-07-12 15:56:20.675668] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1131190 name Existed_Raid, state configuring 00:20:00.355 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:00.669 [2024-07-12 15:56:20.872180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:00.669 [2024-07-12 15:56:20.873291] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:00.669 [2024-07-12 15:56:20.873315] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:00.669 [2024-07-12 15:56:20.873321] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:00.669 [2024-07-12 15:56:20.873327] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:00.669 [2024-07-12 15:56:20.873332] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:00.669 [2024-07-12 15:56:20.873342] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:00.669 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.669 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:00.669 "name": 
"Existed_Raid", 00:20:00.669 "uuid": "04530dc0-6c77-4093-b74c-1816f5479809", 00:20:00.669 "strip_size_kb": 64, 00:20:00.669 "state": "configuring", 00:20:00.669 "raid_level": "concat", 00:20:00.669 "superblock": true, 00:20:00.669 "num_base_bdevs": 4, 00:20:00.669 "num_base_bdevs_discovered": 1, 00:20:00.669 "num_base_bdevs_operational": 4, 00:20:00.669 "base_bdevs_list": [ 00:20:00.669 { 00:20:00.669 "name": "BaseBdev1", 00:20:00.669 "uuid": "aed6f78c-32b2-4a9e-ba19-92c55cb8cef3", 00:20:00.669 "is_configured": true, 00:20:00.669 "data_offset": 2048, 00:20:00.669 "data_size": 63488 00:20:00.669 }, 00:20:00.669 { 00:20:00.669 "name": "BaseBdev2", 00:20:00.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:00.669 "is_configured": false, 00:20:00.669 "data_offset": 0, 00:20:00.669 "data_size": 0 00:20:00.669 }, 00:20:00.669 { 00:20:00.669 "name": "BaseBdev3", 00:20:00.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:00.669 "is_configured": false, 00:20:00.669 "data_offset": 0, 00:20:00.669 "data_size": 0 00:20:00.669 }, 00:20:00.669 { 00:20:00.669 "name": "BaseBdev4", 00:20:00.669 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:00.669 "is_configured": false, 00:20:00.669 "data_offset": 0, 00:20:00.669 "data_size": 0 00:20:00.669 } 00:20:00.669 ] 00:20:00.669 }' 00:20:00.669 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:00.669 15:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:01.239 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:01.499 [2024-07-12 15:56:21.811328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:01.499 BaseBdev2 00:20:01.499 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:01.499 15:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:01.499 15:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:01.499 15:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:01.499 15:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:01.499 15:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:01.499 15:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:01.759 [ 00:20:01.759 { 00:20:01.759 "name": "BaseBdev2", 00:20:01.759 "aliases": [ 00:20:01.759 "6f18a0e2-d571-405b-9599-96024063f654" 00:20:01.759 ], 00:20:01.759 "product_name": "Malloc disk", 00:20:01.759 "block_size": 512, 00:20:01.759 "num_blocks": 65536, 00:20:01.759 "uuid": "6f18a0e2-d571-405b-9599-96024063f654", 00:20:01.759 "assigned_rate_limits": { 00:20:01.759 "rw_ios_per_sec": 0, 00:20:01.759 "rw_mbytes_per_sec": 0, 00:20:01.759 "r_mbytes_per_sec": 0, 00:20:01.759 "w_mbytes_per_sec": 0 00:20:01.759 }, 00:20:01.759 "claimed": true, 
00:20:01.759 "claim_type": "exclusive_write", 00:20:01.759 "zoned": false, 00:20:01.759 "supported_io_types": { 00:20:01.759 "read": true, 00:20:01.759 "write": true, 00:20:01.759 "unmap": true, 00:20:01.759 "flush": true, 00:20:01.759 "reset": true, 00:20:01.759 "nvme_admin": false, 00:20:01.759 "nvme_io": false, 00:20:01.759 "nvme_io_md": false, 00:20:01.759 "write_zeroes": true, 00:20:01.759 "zcopy": true, 00:20:01.759 "get_zone_info": false, 00:20:01.759 "zone_management": false, 00:20:01.759 "zone_append": false, 00:20:01.759 "compare": false, 00:20:01.759 "compare_and_write": false, 00:20:01.759 "abort": true, 00:20:01.759 "seek_hole": false, 00:20:01.759 "seek_data": false, 00:20:01.759 "copy": true, 00:20:01.759 "nvme_iov_md": false 00:20:01.759 }, 00:20:01.759 "memory_domains": [ 00:20:01.759 { 00:20:01.759 "dma_device_id": "system", 00:20:01.759 "dma_device_type": 1 00:20:01.759 }, 00:20:01.759 { 00:20:01.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:01.759 "dma_device_type": 2 00:20:01.759 } 00:20:01.759 ], 00:20:01.759 "driver_specific": {} 00:20:01.759 } 00:20:01.759 ] 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.759 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:02.020 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.020 "name": "Existed_Raid", 00:20:02.020 "uuid": "04530dc0-6c77-4093-b74c-1816f5479809", 00:20:02.020 "strip_size_kb": 64, 00:20:02.020 "state": "configuring", 00:20:02.020 "raid_level": "concat", 00:20:02.020 "superblock": true, 00:20:02.020 "num_base_bdevs": 4, 00:20:02.020 "num_base_bdevs_discovered": 2, 00:20:02.020 "num_base_bdevs_operational": 4, 00:20:02.020 "base_bdevs_list": [ 00:20:02.020 { 00:20:02.020 "name": "BaseBdev1", 00:20:02.020 "uuid": 
"aed6f78c-32b2-4a9e-ba19-92c55cb8cef3", 00:20:02.020 "is_configured": true, 00:20:02.020 "data_offset": 2048, 00:20:02.020 "data_size": 63488 00:20:02.020 }, 00:20:02.020 { 00:20:02.020 "name": "BaseBdev2", 00:20:02.020 "uuid": "6f18a0e2-d571-405b-9599-96024063f654", 00:20:02.020 "is_configured": true, 00:20:02.020 "data_offset": 2048, 00:20:02.020 "data_size": 63488 00:20:02.020 }, 00:20:02.020 { 00:20:02.020 "name": "BaseBdev3", 00:20:02.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.020 "is_configured": false, 00:20:02.020 "data_offset": 0, 00:20:02.020 "data_size": 0 00:20:02.020 }, 00:20:02.020 { 00:20:02.020 "name": "BaseBdev4", 00:20:02.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.020 "is_configured": false, 00:20:02.020 "data_offset": 0, 00:20:02.020 "data_size": 0 00:20:02.020 } 00:20:02.020 ] 00:20:02.020 }' 00:20:02.020 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.020 15:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:02.589 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:02.848 [2024-07-12 15:56:23.115561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:02.848 BaseBdev3 00:20:02.848 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:02.848 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:02.848 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:02.848 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:02.848 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:02.848 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:02.848 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:03.107 [ 00:20:03.107 { 00:20:03.107 "name": "BaseBdev3", 00:20:03.107 "aliases": [ 00:20:03.107 "93b624a5-5cf2-4623-b6aa-5ad6a3ac27a3" 00:20:03.107 ], 00:20:03.107 "product_name": "Malloc disk", 00:20:03.107 "block_size": 512, 00:20:03.107 "num_blocks": 65536, 00:20:03.107 "uuid": "93b624a5-5cf2-4623-b6aa-5ad6a3ac27a3", 00:20:03.107 "assigned_rate_limits": { 00:20:03.107 "rw_ios_per_sec": 0, 00:20:03.107 "rw_mbytes_per_sec": 0, 00:20:03.107 "r_mbytes_per_sec": 0, 00:20:03.107 "w_mbytes_per_sec": 0 00:20:03.107 }, 00:20:03.107 "claimed": true, 00:20:03.107 "claim_type": "exclusive_write", 00:20:03.107 "zoned": false, 00:20:03.107 "supported_io_types": { 00:20:03.107 "read": true, 00:20:03.107 "write": true, 00:20:03.107 "unmap": true, 00:20:03.107 "flush": true, 00:20:03.107 "reset": true, 00:20:03.107 "nvme_admin": false, 00:20:03.107 "nvme_io": false, 00:20:03.107 "nvme_io_md": false, 00:20:03.107 "write_zeroes": true, 00:20:03.107 "zcopy": true, 00:20:03.107 "get_zone_info": 
false, 00:20:03.107 "zone_management": false, 00:20:03.107 "zone_append": false, 00:20:03.107 "compare": false, 00:20:03.107 "compare_and_write": false, 00:20:03.107 "abort": true, 00:20:03.107 "seek_hole": false, 00:20:03.107 "seek_data": false, 00:20:03.107 "copy": true, 00:20:03.107 "nvme_iov_md": false 00:20:03.107 }, 00:20:03.107 "memory_domains": [ 00:20:03.107 { 00:20:03.107 "dma_device_id": "system", 00:20:03.107 "dma_device_type": 1 00:20:03.107 }, 00:20:03.107 { 00:20:03.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.107 "dma_device_type": 2 00:20:03.107 } 00:20:03.107 ], 00:20:03.107 "driver_specific": {} 00:20:03.107 } 00:20:03.107 ] 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.107 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:03.367 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:03.367 "name": "Existed_Raid", 00:20:03.367 "uuid": "04530dc0-6c77-4093-b74c-1816f5479809", 00:20:03.367 "strip_size_kb": 64, 00:20:03.367 "state": "configuring", 00:20:03.367 "raid_level": "concat", 00:20:03.367 "superblock": true, 00:20:03.367 "num_base_bdevs": 4, 00:20:03.367 "num_base_bdevs_discovered": 3, 00:20:03.367 "num_base_bdevs_operational": 4, 00:20:03.367 "base_bdevs_list": [ 00:20:03.367 { 00:20:03.367 "name": "BaseBdev1", 00:20:03.367 "uuid": "aed6f78c-32b2-4a9e-ba19-92c55cb8cef3", 00:20:03.367 "is_configured": true, 00:20:03.367 "data_offset": 2048, 00:20:03.367 "data_size": 63488 00:20:03.367 }, 00:20:03.367 { 00:20:03.367 "name": "BaseBdev2", 00:20:03.367 "uuid": "6f18a0e2-d571-405b-9599-96024063f654", 00:20:03.367 "is_configured": true, 00:20:03.367 "data_offset": 2048, 00:20:03.367 "data_size": 63488 00:20:03.367 }, 00:20:03.367 { 00:20:03.367 "name": "BaseBdev3", 00:20:03.367 "uuid": 
"93b624a5-5cf2-4623-b6aa-5ad6a3ac27a3", 00:20:03.367 "is_configured": true, 00:20:03.367 "data_offset": 2048, 00:20:03.367 "data_size": 63488 00:20:03.367 }, 00:20:03.367 { 00:20:03.367 "name": "BaseBdev4", 00:20:03.367 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:03.367 "is_configured": false, 00:20:03.367 "data_offset": 0, 00:20:03.367 "data_size": 0 00:20:03.367 } 00:20:03.367 ] 00:20:03.367 }' 00:20:03.367 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:03.367 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:03.939 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:04.200 [2024-07-12 15:56:24.407638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:04.200 [2024-07-12 15:56:24.407768] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11321d0 00:20:04.200 [2024-07-12 15:56:24.407777] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:04.200 [2024-07-12 15:56:24.407915] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1133220 00:20:04.200 [2024-07-12 15:56:24.408008] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11321d0 00:20:04.200 [2024-07-12 15:56:24.408014] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11321d0 00:20:04.200 [2024-07-12 15:56:24.408080] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:04.200 BaseBdev4 00:20:04.200 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:04.200 15:56:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:04.200 15:56:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:04.200 15:56:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:04.200 15:56:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:04.200 15:56:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:04.200 15:56:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:04.200 15:56:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:04.460 [ 00:20:04.460 { 00:20:04.460 "name": "BaseBdev4", 00:20:04.460 "aliases": [ 00:20:04.460 "36274013-90e9-4b5f-8504-1ec7d9c9d7e9" 00:20:04.460 ], 00:20:04.460 "product_name": "Malloc disk", 00:20:04.460 "block_size": 512, 00:20:04.460 "num_blocks": 65536, 00:20:04.460 "uuid": "36274013-90e9-4b5f-8504-1ec7d9c9d7e9", 00:20:04.460 "assigned_rate_limits": { 00:20:04.460 "rw_ios_per_sec": 0, 00:20:04.460 "rw_mbytes_per_sec": 0, 00:20:04.460 "r_mbytes_per_sec": 0, 00:20:04.460 "w_mbytes_per_sec": 0 00:20:04.460 }, 00:20:04.460 "claimed": true, 00:20:04.460 "claim_type": "exclusive_write", 00:20:04.460 "zoned": false, 00:20:04.460 "supported_io_types": { 00:20:04.460 "read": 
true, 00:20:04.460 "write": true, 00:20:04.460 "unmap": true, 00:20:04.460 "flush": true, 00:20:04.460 "reset": true, 00:20:04.460 "nvme_admin": false, 00:20:04.460 "nvme_io": false, 00:20:04.460 "nvme_io_md": false, 00:20:04.460 "write_zeroes": true, 00:20:04.460 "zcopy": true, 00:20:04.460 "get_zone_info": false, 00:20:04.461 "zone_management": false, 00:20:04.461 "zone_append": false, 00:20:04.461 "compare": false, 00:20:04.461 "compare_and_write": false, 00:20:04.461 "abort": true, 00:20:04.461 "seek_hole": false, 00:20:04.461 "seek_data": false, 00:20:04.461 "copy": true, 00:20:04.461 "nvme_iov_md": false 00:20:04.461 }, 00:20:04.461 "memory_domains": [ 00:20:04.461 { 00:20:04.461 "dma_device_id": "system", 00:20:04.461 "dma_device_type": 1 00:20:04.461 }, 00:20:04.461 { 00:20:04.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.461 "dma_device_type": 2 00:20:04.461 } 00:20:04.461 ], 00:20:04.461 "driver_specific": {} 00:20:04.461 } 00:20:04.461 ] 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.461 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:04.721 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.721 "name": "Existed_Raid", 00:20:04.721 "uuid": "04530dc0-6c77-4093-b74c-1816f5479809", 00:20:04.721 "strip_size_kb": 64, 00:20:04.721 "state": "online", 00:20:04.721 "raid_level": "concat", 00:20:04.721 "superblock": true, 00:20:04.721 "num_base_bdevs": 4, 00:20:04.721 "num_base_bdevs_discovered": 4, 00:20:04.721 "num_base_bdevs_operational": 4, 00:20:04.721 "base_bdevs_list": [ 00:20:04.721 { 00:20:04.721 "name": "BaseBdev1", 00:20:04.721 "uuid": "aed6f78c-32b2-4a9e-ba19-92c55cb8cef3", 00:20:04.721 "is_configured": true, 00:20:04.721 "data_offset": 2048, 00:20:04.721 "data_size": 63488 00:20:04.721 }, 
00:20:04.721 { 00:20:04.721 "name": "BaseBdev2", 00:20:04.721 "uuid": "6f18a0e2-d571-405b-9599-96024063f654", 00:20:04.721 "is_configured": true, 00:20:04.721 "data_offset": 2048, 00:20:04.721 "data_size": 63488 00:20:04.721 }, 00:20:04.721 { 00:20:04.721 "name": "BaseBdev3", 00:20:04.721 "uuid": "93b624a5-5cf2-4623-b6aa-5ad6a3ac27a3", 00:20:04.721 "is_configured": true, 00:20:04.721 "data_offset": 2048, 00:20:04.721 "data_size": 63488 00:20:04.721 }, 00:20:04.721 { 00:20:04.721 "name": "BaseBdev4", 00:20:04.721 "uuid": "36274013-90e9-4b5f-8504-1ec7d9c9d7e9", 00:20:04.721 "is_configured": true, 00:20:04.721 "data_offset": 2048, 00:20:04.721 "data_size": 63488 00:20:04.721 } 00:20:04.721 ] 00:20:04.721 }' 00:20:04.721 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.721 15:56:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:05.291 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:05.291 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:05.291 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:05.291 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:05.291 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:05.291 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:05.291 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:05.291 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:05.291 [2024-07-12 15:56:25.719243] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:05.551 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:05.551 "name": "Existed_Raid", 00:20:05.551 "aliases": [ 00:20:05.551 "04530dc0-6c77-4093-b74c-1816f5479809" 00:20:05.551 ], 00:20:05.551 "product_name": "Raid Volume", 00:20:05.551 "block_size": 512, 00:20:05.551 "num_blocks": 253952, 00:20:05.551 "uuid": "04530dc0-6c77-4093-b74c-1816f5479809", 00:20:05.551 "assigned_rate_limits": { 00:20:05.551 "rw_ios_per_sec": 0, 00:20:05.551 "rw_mbytes_per_sec": 0, 00:20:05.551 "r_mbytes_per_sec": 0, 00:20:05.551 "w_mbytes_per_sec": 0 00:20:05.551 }, 00:20:05.551 "claimed": false, 00:20:05.551 "zoned": false, 00:20:05.551 "supported_io_types": { 00:20:05.551 "read": true, 00:20:05.551 "write": true, 00:20:05.551 "unmap": true, 00:20:05.551 "flush": true, 00:20:05.551 "reset": true, 00:20:05.551 "nvme_admin": false, 00:20:05.551 "nvme_io": false, 00:20:05.551 "nvme_io_md": false, 00:20:05.551 "write_zeroes": true, 00:20:05.551 "zcopy": false, 00:20:05.551 "get_zone_info": false, 00:20:05.551 "zone_management": false, 00:20:05.551 "zone_append": false, 00:20:05.551 "compare": false, 00:20:05.551 "compare_and_write": false, 00:20:05.551 "abort": false, 00:20:05.551 "seek_hole": false, 00:20:05.551 "seek_data": false, 00:20:05.551 "copy": false, 00:20:05.551 "nvme_iov_md": false 00:20:05.551 }, 00:20:05.551 "memory_domains": [ 00:20:05.551 { 00:20:05.551 "dma_device_id": "system", 00:20:05.551 "dma_device_type": 1 00:20:05.551 }, 
00:20:05.551 { 00:20:05.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.551 "dma_device_type": 2 00:20:05.551 }, 00:20:05.551 { 00:20:05.551 "dma_device_id": "system", 00:20:05.551 "dma_device_type": 1 00:20:05.551 }, 00:20:05.551 { 00:20:05.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.551 "dma_device_type": 2 00:20:05.551 }, 00:20:05.551 { 00:20:05.551 "dma_device_id": "system", 00:20:05.551 "dma_device_type": 1 00:20:05.551 }, 00:20:05.551 { 00:20:05.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.551 "dma_device_type": 2 00:20:05.551 }, 00:20:05.551 { 00:20:05.551 "dma_device_id": "system", 00:20:05.551 "dma_device_type": 1 00:20:05.551 }, 00:20:05.551 { 00:20:05.551 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.551 "dma_device_type": 2 00:20:05.551 } 00:20:05.551 ], 00:20:05.551 "driver_specific": { 00:20:05.551 "raid": { 00:20:05.551 "uuid": "04530dc0-6c77-4093-b74c-1816f5479809", 00:20:05.551 "strip_size_kb": 64, 00:20:05.551 "state": "online", 00:20:05.551 "raid_level": "concat", 00:20:05.551 "superblock": true, 00:20:05.551 "num_base_bdevs": 4, 00:20:05.551 "num_base_bdevs_discovered": 4, 00:20:05.551 "num_base_bdevs_operational": 4, 00:20:05.551 "base_bdevs_list": [ 00:20:05.551 { 00:20:05.551 "name": "BaseBdev1", 00:20:05.551 "uuid": "aed6f78c-32b2-4a9e-ba19-92c55cb8cef3", 00:20:05.552 "is_configured": true, 00:20:05.552 "data_offset": 2048, 00:20:05.552 "data_size": 63488 00:20:05.552 }, 00:20:05.552 { 00:20:05.552 "name": "BaseBdev2", 00:20:05.552 "uuid": "6f18a0e2-d571-405b-9599-96024063f654", 00:20:05.552 "is_configured": true, 00:20:05.552 "data_offset": 2048, 00:20:05.552 "data_size": 63488 00:20:05.552 }, 00:20:05.552 { 00:20:05.552 "name": "BaseBdev3", 00:20:05.552 "uuid": "93b624a5-5cf2-4623-b6aa-5ad6a3ac27a3", 00:20:05.552 "is_configured": true, 00:20:05.552 "data_offset": 2048, 00:20:05.552 "data_size": 63488 00:20:05.552 }, 00:20:05.552 { 00:20:05.552 "name": "BaseBdev4", 00:20:05.552 "uuid": "36274013-90e9-4b5f-8504-1ec7d9c9d7e9", 00:20:05.552 "is_configured": true, 00:20:05.552 "data_offset": 2048, 00:20:05.552 "data_size": 63488 00:20:05.552 } 00:20:05.552 ] 00:20:05.552 } 00:20:05.552 } 00:20:05.552 }' 00:20:05.552 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:05.552 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:05.552 BaseBdev2 00:20:05.552 BaseBdev3 00:20:05.552 BaseBdev4' 00:20:05.552 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:05.552 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:05.552 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:05.552 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:05.552 "name": "BaseBdev1", 00:20:05.552 "aliases": [ 00:20:05.552 "aed6f78c-32b2-4a9e-ba19-92c55cb8cef3" 00:20:05.552 ], 00:20:05.552 "product_name": "Malloc disk", 00:20:05.552 "block_size": 512, 00:20:05.552 "num_blocks": 65536, 00:20:05.552 "uuid": "aed6f78c-32b2-4a9e-ba19-92c55cb8cef3", 00:20:05.552 "assigned_rate_limits": { 00:20:05.552 "rw_ios_per_sec": 0, 00:20:05.552 "rw_mbytes_per_sec": 0, 00:20:05.552 "r_mbytes_per_sec": 0, 00:20:05.552 
"w_mbytes_per_sec": 0 00:20:05.552 }, 00:20:05.552 "claimed": true, 00:20:05.552 "claim_type": "exclusive_write", 00:20:05.552 "zoned": false, 00:20:05.552 "supported_io_types": { 00:20:05.552 "read": true, 00:20:05.552 "write": true, 00:20:05.552 "unmap": true, 00:20:05.552 "flush": true, 00:20:05.552 "reset": true, 00:20:05.552 "nvme_admin": false, 00:20:05.552 "nvme_io": false, 00:20:05.552 "nvme_io_md": false, 00:20:05.552 "write_zeroes": true, 00:20:05.552 "zcopy": true, 00:20:05.552 "get_zone_info": false, 00:20:05.552 "zone_management": false, 00:20:05.552 "zone_append": false, 00:20:05.552 "compare": false, 00:20:05.552 "compare_and_write": false, 00:20:05.552 "abort": true, 00:20:05.552 "seek_hole": false, 00:20:05.552 "seek_data": false, 00:20:05.552 "copy": true, 00:20:05.552 "nvme_iov_md": false 00:20:05.552 }, 00:20:05.552 "memory_domains": [ 00:20:05.552 { 00:20:05.552 "dma_device_id": "system", 00:20:05.552 "dma_device_type": 1 00:20:05.552 }, 00:20:05.552 { 00:20:05.552 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.552 "dma_device_type": 2 00:20:05.552 } 00:20:05.552 ], 00:20:05.552 "driver_specific": {} 00:20:05.552 }' 00:20:05.552 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.811 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.811 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:05.811 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.811 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.811 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:05.811 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.811 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.811 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:05.811 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:06.071 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:06.071 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:06.071 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:06.071 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:06.071 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:06.071 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:06.071 "name": "BaseBdev2", 00:20:06.071 "aliases": [ 00:20:06.071 "6f18a0e2-d571-405b-9599-96024063f654" 00:20:06.071 ], 00:20:06.071 "product_name": "Malloc disk", 00:20:06.071 "block_size": 512, 00:20:06.071 "num_blocks": 65536, 00:20:06.071 "uuid": "6f18a0e2-d571-405b-9599-96024063f654", 00:20:06.071 "assigned_rate_limits": { 00:20:06.071 "rw_ios_per_sec": 0, 00:20:06.071 "rw_mbytes_per_sec": 0, 00:20:06.071 "r_mbytes_per_sec": 0, 00:20:06.071 "w_mbytes_per_sec": 0 00:20:06.071 }, 00:20:06.071 "claimed": true, 00:20:06.071 "claim_type": "exclusive_write", 00:20:06.071 
"zoned": false, 00:20:06.071 "supported_io_types": { 00:20:06.071 "read": true, 00:20:06.071 "write": true, 00:20:06.071 "unmap": true, 00:20:06.071 "flush": true, 00:20:06.071 "reset": true, 00:20:06.071 "nvme_admin": false, 00:20:06.071 "nvme_io": false, 00:20:06.071 "nvme_io_md": false, 00:20:06.071 "write_zeroes": true, 00:20:06.071 "zcopy": true, 00:20:06.071 "get_zone_info": false, 00:20:06.071 "zone_management": false, 00:20:06.071 "zone_append": false, 00:20:06.071 "compare": false, 00:20:06.071 "compare_and_write": false, 00:20:06.071 "abort": true, 00:20:06.071 "seek_hole": false, 00:20:06.071 "seek_data": false, 00:20:06.071 "copy": true, 00:20:06.071 "nvme_iov_md": false 00:20:06.071 }, 00:20:06.071 "memory_domains": [ 00:20:06.071 { 00:20:06.072 "dma_device_id": "system", 00:20:06.072 "dma_device_type": 1 00:20:06.072 }, 00:20:06.072 { 00:20:06.072 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:06.072 "dma_device_type": 2 00:20:06.072 } 00:20:06.072 ], 00:20:06.072 "driver_specific": {} 00:20:06.072 }' 00:20:06.072 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:06.331 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:06.592 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:06.592 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:06.592 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:06.592 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:06.592 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:06.592 "name": "BaseBdev3", 00:20:06.592 "aliases": [ 00:20:06.592 "93b624a5-5cf2-4623-b6aa-5ad6a3ac27a3" 00:20:06.592 ], 00:20:06.592 "product_name": "Malloc disk", 00:20:06.592 "block_size": 512, 00:20:06.592 "num_blocks": 65536, 00:20:06.592 "uuid": "93b624a5-5cf2-4623-b6aa-5ad6a3ac27a3", 00:20:06.592 "assigned_rate_limits": { 00:20:06.592 "rw_ios_per_sec": 0, 00:20:06.592 "rw_mbytes_per_sec": 0, 00:20:06.592 "r_mbytes_per_sec": 0, 00:20:06.592 "w_mbytes_per_sec": 0 00:20:06.592 }, 00:20:06.592 "claimed": true, 00:20:06.592 "claim_type": "exclusive_write", 00:20:06.592 "zoned": false, 00:20:06.592 "supported_io_types": { 00:20:06.592 "read": true, 00:20:06.592 "write": true, 00:20:06.592 "unmap": 
true, 00:20:06.592 "flush": true, 00:20:06.592 "reset": true, 00:20:06.592 "nvme_admin": false, 00:20:06.592 "nvme_io": false, 00:20:06.592 "nvme_io_md": false, 00:20:06.592 "write_zeroes": true, 00:20:06.592 "zcopy": true, 00:20:06.592 "get_zone_info": false, 00:20:06.592 "zone_management": false, 00:20:06.592 "zone_append": false, 00:20:06.592 "compare": false, 00:20:06.592 "compare_and_write": false, 00:20:06.592 "abort": true, 00:20:06.592 "seek_hole": false, 00:20:06.592 "seek_data": false, 00:20:06.592 "copy": true, 00:20:06.592 "nvme_iov_md": false 00:20:06.592 }, 00:20:06.592 "memory_domains": [ 00:20:06.592 { 00:20:06.592 "dma_device_id": "system", 00:20:06.592 "dma_device_type": 1 00:20:06.592 }, 00:20:06.592 { 00:20:06.592 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:06.592 "dma_device_type": 2 00:20:06.592 } 00:20:06.592 ], 00:20:06.592 "driver_specific": {} 00:20:06.592 }' 00:20:06.592 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:06.852 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:06.852 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:06.852 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:06.852 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:06.852 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:06.852 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:06.852 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:06.852 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:06.852 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:07.111 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:07.111 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:07.111 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:07.111 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:07.111 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:07.371 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:07.371 "name": "BaseBdev4", 00:20:07.371 "aliases": [ 00:20:07.371 "36274013-90e9-4b5f-8504-1ec7d9c9d7e9" 00:20:07.371 ], 00:20:07.371 "product_name": "Malloc disk", 00:20:07.371 "block_size": 512, 00:20:07.371 "num_blocks": 65536, 00:20:07.371 "uuid": "36274013-90e9-4b5f-8504-1ec7d9c9d7e9", 00:20:07.371 "assigned_rate_limits": { 00:20:07.371 "rw_ios_per_sec": 0, 00:20:07.371 "rw_mbytes_per_sec": 0, 00:20:07.371 "r_mbytes_per_sec": 0, 00:20:07.371 "w_mbytes_per_sec": 0 00:20:07.371 }, 00:20:07.371 "claimed": true, 00:20:07.371 "claim_type": "exclusive_write", 00:20:07.371 "zoned": false, 00:20:07.371 "supported_io_types": { 00:20:07.371 "read": true, 00:20:07.371 "write": true, 00:20:07.371 "unmap": true, 00:20:07.371 "flush": true, 00:20:07.371 "reset": true, 00:20:07.371 "nvme_admin": false, 00:20:07.371 "nvme_io": false, 
00:20:07.371 "nvme_io_md": false, 00:20:07.371 "write_zeroes": true, 00:20:07.371 "zcopy": true, 00:20:07.371 "get_zone_info": false, 00:20:07.371 "zone_management": false, 00:20:07.371 "zone_append": false, 00:20:07.371 "compare": false, 00:20:07.371 "compare_and_write": false, 00:20:07.371 "abort": true, 00:20:07.371 "seek_hole": false, 00:20:07.371 "seek_data": false, 00:20:07.371 "copy": true, 00:20:07.371 "nvme_iov_md": false 00:20:07.371 }, 00:20:07.371 "memory_domains": [ 00:20:07.371 { 00:20:07.371 "dma_device_id": "system", 00:20:07.371 "dma_device_type": 1 00:20:07.371 }, 00:20:07.371 { 00:20:07.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.371 "dma_device_type": 2 00:20:07.371 } 00:20:07.371 ], 00:20:07.371 "driver_specific": {} 00:20:07.371 }' 00:20:07.371 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:07.371 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:07.371 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:07.371 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:07.371 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:07.371 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:07.371 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:07.371 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:07.630 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:07.630 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:07.630 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:07.630 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:07.630 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:07.890 [2024-07-12 15:56:28.121094] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:07.890 [2024-07-12 15:56:28.121111] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:07.890 [2024-07-12 15:56:28.121146] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:20:07.890 15:56:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.890 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.149 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.149 "name": "Existed_Raid", 00:20:08.149 "uuid": "04530dc0-6c77-4093-b74c-1816f5479809", 00:20:08.149 "strip_size_kb": 64, 00:20:08.149 "state": "offline", 00:20:08.149 "raid_level": "concat", 00:20:08.149 "superblock": true, 00:20:08.149 "num_base_bdevs": 4, 00:20:08.149 "num_base_bdevs_discovered": 3, 00:20:08.149 "num_base_bdevs_operational": 3, 00:20:08.150 "base_bdevs_list": [ 00:20:08.150 { 00:20:08.150 "name": null, 00:20:08.150 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:08.150 "is_configured": false, 00:20:08.150 "data_offset": 2048, 00:20:08.150 "data_size": 63488 00:20:08.150 }, 00:20:08.150 { 00:20:08.150 "name": "BaseBdev2", 00:20:08.150 "uuid": "6f18a0e2-d571-405b-9599-96024063f654", 00:20:08.150 "is_configured": true, 00:20:08.150 "data_offset": 2048, 00:20:08.150 "data_size": 63488 00:20:08.150 }, 00:20:08.150 { 00:20:08.150 "name": "BaseBdev3", 00:20:08.150 "uuid": "93b624a5-5cf2-4623-b6aa-5ad6a3ac27a3", 00:20:08.150 "is_configured": true, 00:20:08.150 "data_offset": 2048, 00:20:08.150 "data_size": 63488 00:20:08.150 }, 00:20:08.150 { 00:20:08.150 "name": "BaseBdev4", 00:20:08.150 "uuid": "36274013-90e9-4b5f-8504-1ec7d9c9d7e9", 00:20:08.150 "is_configured": true, 00:20:08.150 "data_offset": 2048, 00:20:08.150 "data_size": 63488 00:20:08.150 } 00:20:08.150 ] 00:20:08.150 }' 00:20:08.150 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.150 15:56:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:08.439 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:08.439 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:08.439 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.439 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:08.699 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:08.699 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:08.699 15:56:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:08.958 [2024-07-12 15:56:29.247940] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:08.958 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:08.958 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:08.959 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.959 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:09.218 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:09.218 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:09.218 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:09.218 [2024-07-12 15:56:29.634639] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:09.478 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:09.478 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:09.478 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.478 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:09.478 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:09.478 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:09.478 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:09.738 [2024-07-12 15:56:30.029437] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:09.738 [2024-07-12 15:56:30.029466] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11321d0 name Existed_Raid, state offline 00:20:09.738 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:09.738 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:09.738 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.738 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:09.997 BaseBdev2 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:09.997 15:56:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:10.257 15:56:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:10.516 [ 00:20:10.516 { 00:20:10.516 "name": "BaseBdev2", 00:20:10.516 "aliases": [ 00:20:10.516 "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7" 00:20:10.516 ], 00:20:10.516 "product_name": "Malloc disk", 00:20:10.516 "block_size": 512, 00:20:10.516 "num_blocks": 65536, 00:20:10.516 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:10.516 "assigned_rate_limits": { 00:20:10.516 "rw_ios_per_sec": 0, 00:20:10.516 "rw_mbytes_per_sec": 0, 00:20:10.516 "r_mbytes_per_sec": 0, 00:20:10.516 "w_mbytes_per_sec": 0 00:20:10.516 }, 00:20:10.516 "claimed": false, 00:20:10.516 "zoned": false, 00:20:10.516 "supported_io_types": { 00:20:10.516 "read": true, 00:20:10.516 "write": true, 00:20:10.516 "unmap": true, 00:20:10.516 "flush": true, 00:20:10.516 "reset": true, 00:20:10.516 "nvme_admin": false, 00:20:10.516 "nvme_io": false, 00:20:10.516 "nvme_io_md": false, 00:20:10.516 "write_zeroes": true, 00:20:10.516 "zcopy": true, 00:20:10.516 "get_zone_info": false, 00:20:10.516 "zone_management": false, 00:20:10.517 "zone_append": false, 00:20:10.517 "compare": false, 00:20:10.517 "compare_and_write": false, 00:20:10.517 "abort": true, 00:20:10.517 "seek_hole": false, 00:20:10.517 "seek_data": false, 00:20:10.517 "copy": true, 00:20:10.517 "nvme_iov_md": false 00:20:10.517 }, 00:20:10.517 "memory_domains": [ 00:20:10.517 { 00:20:10.517 "dma_device_id": "system", 00:20:10.517 "dma_device_type": 1 00:20:10.517 }, 00:20:10.517 { 00:20:10.517 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.517 "dma_device_type": 2 00:20:10.517 } 00:20:10.517 ], 00:20:10.517 "driver_specific": {} 00:20:10.517 } 00:20:10.517 ] 00:20:10.517 15:56:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:10.517 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:10.517 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:10.517 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:10.777 BaseBdev3 00:20:10.777 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:10.777 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:10.777 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:10.777 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:10.777 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:10.777 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:10.777 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:10.777 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:11.037 [ 00:20:11.037 { 00:20:11.037 "name": "BaseBdev3", 00:20:11.037 "aliases": [ 00:20:11.037 "026ab199-e9ca-4bdc-82a1-3a0849cac086" 00:20:11.037 ], 00:20:11.037 "product_name": "Malloc disk", 00:20:11.037 "block_size": 512, 00:20:11.037 "num_blocks": 65536, 00:20:11.037 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:11.037 "assigned_rate_limits": { 00:20:11.037 "rw_ios_per_sec": 0, 00:20:11.037 "rw_mbytes_per_sec": 0, 00:20:11.037 "r_mbytes_per_sec": 0, 00:20:11.037 "w_mbytes_per_sec": 0 00:20:11.037 }, 00:20:11.037 "claimed": false, 00:20:11.037 "zoned": false, 00:20:11.037 "supported_io_types": { 00:20:11.037 "read": true, 00:20:11.037 "write": true, 00:20:11.037 "unmap": true, 00:20:11.037 "flush": true, 00:20:11.037 "reset": true, 00:20:11.037 "nvme_admin": false, 00:20:11.037 "nvme_io": false, 00:20:11.037 "nvme_io_md": false, 00:20:11.037 "write_zeroes": true, 00:20:11.037 "zcopy": true, 00:20:11.037 "get_zone_info": false, 00:20:11.037 "zone_management": false, 00:20:11.037 "zone_append": false, 00:20:11.037 "compare": false, 00:20:11.037 "compare_and_write": false, 00:20:11.037 "abort": true, 00:20:11.037 "seek_hole": false, 00:20:11.037 "seek_data": false, 00:20:11.037 "copy": true, 00:20:11.037 "nvme_iov_md": false 00:20:11.037 }, 00:20:11.037 "memory_domains": [ 00:20:11.037 { 00:20:11.037 "dma_device_id": "system", 00:20:11.037 "dma_device_type": 1 00:20:11.037 }, 00:20:11.037 { 00:20:11.037 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.037 "dma_device_type": 2 00:20:11.037 } 00:20:11.037 ], 00:20:11.037 "driver_specific": {} 00:20:11.037 } 00:20:11.037 ] 00:20:11.037 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:11.037 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:11.037 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:11.037 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:11.296 BaseBdev4 00:20:11.296 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev 
BaseBdev4 00:20:11.296 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:11.296 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:11.296 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:11.296 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:11.296 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:11.296 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:11.556 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:11.556 [ 00:20:11.556 { 00:20:11.556 "name": "BaseBdev4", 00:20:11.556 "aliases": [ 00:20:11.556 "62cc998d-dc00-47a4-b215-8563ce05d5db" 00:20:11.556 ], 00:20:11.556 "product_name": "Malloc disk", 00:20:11.556 "block_size": 512, 00:20:11.556 "num_blocks": 65536, 00:20:11.556 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:11.556 "assigned_rate_limits": { 00:20:11.556 "rw_ios_per_sec": 0, 00:20:11.556 "rw_mbytes_per_sec": 0, 00:20:11.556 "r_mbytes_per_sec": 0, 00:20:11.556 "w_mbytes_per_sec": 0 00:20:11.556 }, 00:20:11.556 "claimed": false, 00:20:11.556 "zoned": false, 00:20:11.556 "supported_io_types": { 00:20:11.556 "read": true, 00:20:11.556 "write": true, 00:20:11.556 "unmap": true, 00:20:11.556 "flush": true, 00:20:11.556 "reset": true, 00:20:11.556 "nvme_admin": false, 00:20:11.556 "nvme_io": false, 00:20:11.556 "nvme_io_md": false, 00:20:11.556 "write_zeroes": true, 00:20:11.556 "zcopy": true, 00:20:11.556 "get_zone_info": false, 00:20:11.556 "zone_management": false, 00:20:11.556 "zone_append": false, 00:20:11.556 "compare": false, 00:20:11.556 "compare_and_write": false, 00:20:11.556 "abort": true, 00:20:11.556 "seek_hole": false, 00:20:11.556 "seek_data": false, 00:20:11.556 "copy": true, 00:20:11.556 "nvme_iov_md": false 00:20:11.556 }, 00:20:11.556 "memory_domains": [ 00:20:11.556 { 00:20:11.556 "dma_device_id": "system", 00:20:11.556 "dma_device_type": 1 00:20:11.556 }, 00:20:11.556 { 00:20:11.556 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.556 "dma_device_type": 2 00:20:11.556 } 00:20:11.556 ], 00:20:11.556 "driver_specific": {} 00:20:11.556 } 00:20:11.556 ] 00:20:11.556 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:11.556 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:11.556 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:11.556 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:11.817 [2024-07-12 15:56:32.152604] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:11.817 [2024-07-12 15:56:32.152631] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:11.817 [2024-07-12 15:56:32.152644] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:11.817 [2024-07-12 15:56:32.153670] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:11.817 [2024-07-12 15:56:32.153701] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.817 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:12.076 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.076 "name": "Existed_Raid", 00:20:12.076 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:12.076 "strip_size_kb": 64, 00:20:12.076 "state": "configuring", 00:20:12.076 "raid_level": "concat", 00:20:12.076 "superblock": true, 00:20:12.076 "num_base_bdevs": 4, 00:20:12.076 "num_base_bdevs_discovered": 3, 00:20:12.076 "num_base_bdevs_operational": 4, 00:20:12.076 "base_bdevs_list": [ 00:20:12.076 { 00:20:12.076 "name": "BaseBdev1", 00:20:12.076 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.076 "is_configured": false, 00:20:12.076 "data_offset": 0, 00:20:12.076 "data_size": 0 00:20:12.076 }, 00:20:12.076 { 00:20:12.076 "name": "BaseBdev2", 00:20:12.077 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:12.077 "is_configured": true, 00:20:12.077 "data_offset": 2048, 00:20:12.077 "data_size": 63488 00:20:12.077 }, 00:20:12.077 { 00:20:12.077 "name": "BaseBdev3", 00:20:12.077 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:12.077 "is_configured": true, 00:20:12.077 "data_offset": 2048, 00:20:12.077 "data_size": 63488 00:20:12.077 }, 00:20:12.077 { 00:20:12.077 "name": "BaseBdev4", 00:20:12.077 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:12.077 "is_configured": true, 00:20:12.077 "data_offset": 2048, 00:20:12.077 "data_size": 63488 00:20:12.077 } 00:20:12.077 ] 00:20:12.077 }' 00:20:12.077 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.077 15:56:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:12.647 15:56:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:12.647 [2024-07-12 15:56:33.086935] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.907 "name": "Existed_Raid", 00:20:12.907 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:12.907 "strip_size_kb": 64, 00:20:12.907 "state": "configuring", 00:20:12.907 "raid_level": "concat", 00:20:12.907 "superblock": true, 00:20:12.907 "num_base_bdevs": 4, 00:20:12.907 "num_base_bdevs_discovered": 2, 00:20:12.907 "num_base_bdevs_operational": 4, 00:20:12.907 "base_bdevs_list": [ 00:20:12.907 { 00:20:12.907 "name": "BaseBdev1", 00:20:12.907 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:12.907 "is_configured": false, 00:20:12.907 "data_offset": 0, 00:20:12.907 "data_size": 0 00:20:12.907 }, 00:20:12.907 { 00:20:12.907 "name": null, 00:20:12.907 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:12.907 "is_configured": false, 00:20:12.907 "data_offset": 2048, 00:20:12.907 "data_size": 63488 00:20:12.907 }, 00:20:12.907 { 00:20:12.907 "name": "BaseBdev3", 00:20:12.907 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:12.907 "is_configured": true, 00:20:12.907 "data_offset": 2048, 00:20:12.907 "data_size": 63488 00:20:12.907 }, 00:20:12.907 { 00:20:12.907 "name": "BaseBdev4", 00:20:12.907 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:12.907 "is_configured": true, 00:20:12.907 "data_offset": 2048, 00:20:12.907 "data_size": 63488 00:20:12.907 } 00:20:12.907 ] 00:20:12.907 }' 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.907 15:56:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:13.478 15:56:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.478 15:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:13.737 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:13.737 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:13.998 [2024-07-12 15:56:34.222768] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:13.998 BaseBdev1 00:20:13.998 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:13.998 15:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:13.998 15:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:13.998 15:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:13.998 15:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:13.998 15:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:13.998 15:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:13.998 15:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:14.258 [ 00:20:14.258 { 00:20:14.258 "name": "BaseBdev1", 00:20:14.258 "aliases": [ 00:20:14.259 "b7b11e65-b8ae-487e-9fca-aa83878ec296" 00:20:14.259 ], 00:20:14.259 "product_name": "Malloc disk", 00:20:14.259 "block_size": 512, 00:20:14.259 "num_blocks": 65536, 00:20:14.259 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:14.259 "assigned_rate_limits": { 00:20:14.259 "rw_ios_per_sec": 0, 00:20:14.259 "rw_mbytes_per_sec": 0, 00:20:14.259 "r_mbytes_per_sec": 0, 00:20:14.259 "w_mbytes_per_sec": 0 00:20:14.259 }, 00:20:14.259 "claimed": true, 00:20:14.259 "claim_type": "exclusive_write", 00:20:14.259 "zoned": false, 00:20:14.259 "supported_io_types": { 00:20:14.259 "read": true, 00:20:14.259 "write": true, 00:20:14.259 "unmap": true, 00:20:14.259 "flush": true, 00:20:14.259 "reset": true, 00:20:14.259 "nvme_admin": false, 00:20:14.259 "nvme_io": false, 00:20:14.259 "nvme_io_md": false, 00:20:14.259 "write_zeroes": true, 00:20:14.259 "zcopy": true, 00:20:14.259 "get_zone_info": false, 00:20:14.259 "zone_management": false, 00:20:14.259 "zone_append": false, 00:20:14.259 "compare": false, 00:20:14.259 "compare_and_write": false, 00:20:14.259 "abort": true, 00:20:14.259 "seek_hole": false, 00:20:14.259 "seek_data": false, 00:20:14.259 "copy": true, 00:20:14.259 "nvme_iov_md": false 00:20:14.259 }, 00:20:14.259 "memory_domains": [ 00:20:14.259 { 00:20:14.259 "dma_device_id": "system", 00:20:14.259 "dma_device_type": 1 00:20:14.259 }, 00:20:14.259 { 00:20:14.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.259 "dma_device_type": 2 00:20:14.259 } 00:20:14.259 ], 00:20:14.259 "driver_specific": {} 00:20:14.259 } 00:20:14.259 ] 00:20:14.259 
15:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.259 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:14.519 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:14.519 "name": "Existed_Raid", 00:20:14.519 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:14.519 "strip_size_kb": 64, 00:20:14.519 "state": "configuring", 00:20:14.519 "raid_level": "concat", 00:20:14.519 "superblock": true, 00:20:14.519 "num_base_bdevs": 4, 00:20:14.519 "num_base_bdevs_discovered": 3, 00:20:14.519 "num_base_bdevs_operational": 4, 00:20:14.519 "base_bdevs_list": [ 00:20:14.519 { 00:20:14.519 "name": "BaseBdev1", 00:20:14.519 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:14.519 "is_configured": true, 00:20:14.519 "data_offset": 2048, 00:20:14.519 "data_size": 63488 00:20:14.519 }, 00:20:14.519 { 00:20:14.519 "name": null, 00:20:14.519 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:14.519 "is_configured": false, 00:20:14.519 "data_offset": 2048, 00:20:14.519 "data_size": 63488 00:20:14.519 }, 00:20:14.519 { 00:20:14.519 "name": "BaseBdev3", 00:20:14.519 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:14.519 "is_configured": true, 00:20:14.519 "data_offset": 2048, 00:20:14.519 "data_size": 63488 00:20:14.519 }, 00:20:14.519 { 00:20:14.519 "name": "BaseBdev4", 00:20:14.519 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:14.519 "is_configured": true, 00:20:14.519 "data_offset": 2048, 00:20:14.519 "data_size": 63488 00:20:14.519 } 00:20:14.519 ] 00:20:14.519 }' 00:20:14.519 15:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:14.519 15:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:15.089 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.089 15:56:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:15.349 [2024-07-12 15:56:35.762678] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.349 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:15.609 15:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.609 "name": "Existed_Raid", 00:20:15.609 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:15.609 "strip_size_kb": 64, 00:20:15.609 "state": "configuring", 00:20:15.609 "raid_level": "concat", 00:20:15.609 "superblock": true, 00:20:15.609 "num_base_bdevs": 4, 00:20:15.609 "num_base_bdevs_discovered": 2, 00:20:15.609 "num_base_bdevs_operational": 4, 00:20:15.609 "base_bdevs_list": [ 00:20:15.609 { 00:20:15.609 "name": "BaseBdev1", 00:20:15.609 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:15.609 "is_configured": true, 00:20:15.609 "data_offset": 2048, 00:20:15.609 "data_size": 63488 00:20:15.609 }, 00:20:15.609 { 00:20:15.609 "name": null, 00:20:15.609 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:15.609 "is_configured": false, 00:20:15.609 "data_offset": 2048, 00:20:15.609 "data_size": 63488 00:20:15.609 }, 00:20:15.609 { 00:20:15.609 "name": null, 00:20:15.609 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:15.609 "is_configured": false, 00:20:15.609 "data_offset": 2048, 00:20:15.609 "data_size": 63488 00:20:15.609 }, 00:20:15.609 { 00:20:15.609 "name": "BaseBdev4", 00:20:15.609 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:15.609 "is_configured": true, 00:20:15.609 "data_offset": 2048, 00:20:15.609 "data_size": 63488 00:20:15.609 } 00:20:15.609 ] 00:20:15.609 }' 00:20:15.609 15:56:35 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.609 15:56:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:16.178 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.178 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:16.439 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:16.439 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:16.439 [2024-07-12 15:56:36.873502] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.699 15:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:16.699 15:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:16.699 "name": "Existed_Raid", 00:20:16.699 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:16.699 "strip_size_kb": 64, 00:20:16.699 "state": "configuring", 00:20:16.699 "raid_level": "concat", 00:20:16.699 "superblock": true, 00:20:16.699 "num_base_bdevs": 4, 00:20:16.699 "num_base_bdevs_discovered": 3, 00:20:16.699 "num_base_bdevs_operational": 4, 00:20:16.699 "base_bdevs_list": [ 00:20:16.699 { 00:20:16.699 "name": "BaseBdev1", 00:20:16.699 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:16.699 "is_configured": true, 00:20:16.699 "data_offset": 2048, 00:20:16.699 "data_size": 63488 00:20:16.699 }, 00:20:16.699 { 00:20:16.699 "name": null, 00:20:16.699 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:16.699 "is_configured": false, 00:20:16.699 "data_offset": 2048, 00:20:16.699 "data_size": 63488 00:20:16.699 }, 00:20:16.699 { 00:20:16.699 "name": "BaseBdev3", 00:20:16.699 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 
00:20:16.699 "is_configured": true, 00:20:16.699 "data_offset": 2048, 00:20:16.699 "data_size": 63488 00:20:16.699 }, 00:20:16.699 { 00:20:16.699 "name": "BaseBdev4", 00:20:16.699 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:16.699 "is_configured": true, 00:20:16.699 "data_offset": 2048, 00:20:16.699 "data_size": 63488 00:20:16.699 } 00:20:16.699 ] 00:20:16.699 }' 00:20:16.699 15:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:16.699 15:56:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:17.269 15:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.269 15:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:17.529 15:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:17.529 15:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:17.789 [2024-07-12 15:56:37.984326] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.789 "name": "Existed_Raid", 00:20:17.789 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:17.789 "strip_size_kb": 64, 00:20:17.789 "state": "configuring", 00:20:17.789 "raid_level": "concat", 00:20:17.789 "superblock": true, 00:20:17.789 "num_base_bdevs": 4, 00:20:17.789 "num_base_bdevs_discovered": 2, 00:20:17.789 "num_base_bdevs_operational": 4, 00:20:17.789 "base_bdevs_list": [ 00:20:17.789 { 00:20:17.789 "name": null, 00:20:17.789 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:17.789 "is_configured": false, 00:20:17.789 "data_offset": 
2048, 00:20:17.789 "data_size": 63488 00:20:17.789 }, 00:20:17.789 { 00:20:17.789 "name": null, 00:20:17.789 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:17.789 "is_configured": false, 00:20:17.789 "data_offset": 2048, 00:20:17.789 "data_size": 63488 00:20:17.789 }, 00:20:17.789 { 00:20:17.789 "name": "BaseBdev3", 00:20:17.789 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:17.789 "is_configured": true, 00:20:17.789 "data_offset": 2048, 00:20:17.789 "data_size": 63488 00:20:17.789 }, 00:20:17.789 { 00:20:17.789 "name": "BaseBdev4", 00:20:17.789 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:17.789 "is_configured": true, 00:20:17.789 "data_offset": 2048, 00:20:17.789 "data_size": 63488 00:20:17.789 } 00:20:17.789 ] 00:20:17.789 }' 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.789 15:56:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:18.358 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.358 15:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:18.928 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:18.928 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:19.188 [2024-07-12 15:56:39.429823] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.188 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:19.448 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:19.448 "name": "Existed_Raid", 00:20:19.448 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:19.448 "strip_size_kb": 64, 
00:20:19.448 "state": "configuring", 00:20:19.448 "raid_level": "concat", 00:20:19.448 "superblock": true, 00:20:19.448 "num_base_bdevs": 4, 00:20:19.448 "num_base_bdevs_discovered": 3, 00:20:19.448 "num_base_bdevs_operational": 4, 00:20:19.448 "base_bdevs_list": [ 00:20:19.448 { 00:20:19.448 "name": null, 00:20:19.448 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:19.448 "is_configured": false, 00:20:19.448 "data_offset": 2048, 00:20:19.448 "data_size": 63488 00:20:19.448 }, 00:20:19.448 { 00:20:19.448 "name": "BaseBdev2", 00:20:19.448 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:19.448 "is_configured": true, 00:20:19.448 "data_offset": 2048, 00:20:19.448 "data_size": 63488 00:20:19.448 }, 00:20:19.448 { 00:20:19.448 "name": "BaseBdev3", 00:20:19.448 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:19.448 "is_configured": true, 00:20:19.448 "data_offset": 2048, 00:20:19.448 "data_size": 63488 00:20:19.448 }, 00:20:19.448 { 00:20:19.448 "name": "BaseBdev4", 00:20:19.448 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:19.448 "is_configured": true, 00:20:19.448 "data_offset": 2048, 00:20:19.448 "data_size": 63488 00:20:19.448 } 00:20:19.448 ] 00:20:19.448 }' 00:20:19.448 15:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:19.448 15:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:20.017 15:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.017 15:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:20.017 15:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:20.017 15:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:20.017 15:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:20.277 15:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b7b11e65-b8ae-487e-9fca-aa83878ec296 00:20:20.537 [2024-07-12 15:56:40.754126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:20.537 [2024-07-12 15:56:40.754242] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1134c90 00:20:20.537 [2024-07-12 15:56:40.754249] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:20.537 [2024-07-12 15:56:40.754387] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11318f0 00:20:20.537 [2024-07-12 15:56:40.754475] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1134c90 00:20:20.537 [2024-07-12 15:56:40.754480] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1134c90 00:20:20.537 [2024-07-12 15:56:40.754547] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:20.537 NewBaseBdev 00:20:20.537 15:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:20.537 15:56:40 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:20.537 15:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:20.537 15:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:20.537 15:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:20.537 15:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:20.537 15:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:20.537 15:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:20.796 [ 00:20:20.796 { 00:20:20.796 "name": "NewBaseBdev", 00:20:20.796 "aliases": [ 00:20:20.796 "b7b11e65-b8ae-487e-9fca-aa83878ec296" 00:20:20.796 ], 00:20:20.796 "product_name": "Malloc disk", 00:20:20.796 "block_size": 512, 00:20:20.796 "num_blocks": 65536, 00:20:20.796 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:20.796 "assigned_rate_limits": { 00:20:20.796 "rw_ios_per_sec": 0, 00:20:20.796 "rw_mbytes_per_sec": 0, 00:20:20.796 "r_mbytes_per_sec": 0, 00:20:20.796 "w_mbytes_per_sec": 0 00:20:20.796 }, 00:20:20.796 "claimed": true, 00:20:20.796 "claim_type": "exclusive_write", 00:20:20.796 "zoned": false, 00:20:20.796 "supported_io_types": { 00:20:20.796 "read": true, 00:20:20.796 "write": true, 00:20:20.797 "unmap": true, 00:20:20.797 "flush": true, 00:20:20.797 "reset": true, 00:20:20.797 "nvme_admin": false, 00:20:20.797 "nvme_io": false, 00:20:20.797 "nvme_io_md": false, 00:20:20.797 "write_zeroes": true, 00:20:20.797 "zcopy": true, 00:20:20.797 "get_zone_info": false, 00:20:20.797 "zone_management": false, 00:20:20.797 "zone_append": false, 00:20:20.797 "compare": false, 00:20:20.797 "compare_and_write": false, 00:20:20.797 "abort": true, 00:20:20.797 "seek_hole": false, 00:20:20.797 "seek_data": false, 00:20:20.797 "copy": true, 00:20:20.797 "nvme_iov_md": false 00:20:20.797 }, 00:20:20.797 "memory_domains": [ 00:20:20.797 { 00:20:20.797 "dma_device_id": "system", 00:20:20.797 "dma_device_type": 1 00:20:20.797 }, 00:20:20.797 { 00:20:20.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:20.797 "dma_device_type": 2 00:20:20.797 } 00:20:20.797 ], 00:20:20.797 "driver_specific": {} 00:20:20.797 } 00:20:20.797 ] 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:20.797 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.056 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.056 "name": "Existed_Raid", 00:20:21.056 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:21.056 "strip_size_kb": 64, 00:20:21.056 "state": "online", 00:20:21.056 "raid_level": "concat", 00:20:21.056 "superblock": true, 00:20:21.056 "num_base_bdevs": 4, 00:20:21.056 "num_base_bdevs_discovered": 4, 00:20:21.056 "num_base_bdevs_operational": 4, 00:20:21.056 "base_bdevs_list": [ 00:20:21.056 { 00:20:21.056 "name": "NewBaseBdev", 00:20:21.056 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:21.056 "is_configured": true, 00:20:21.056 "data_offset": 2048, 00:20:21.056 "data_size": 63488 00:20:21.056 }, 00:20:21.056 { 00:20:21.056 "name": "BaseBdev2", 00:20:21.056 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:21.056 "is_configured": true, 00:20:21.056 "data_offset": 2048, 00:20:21.056 "data_size": 63488 00:20:21.056 }, 00:20:21.056 { 00:20:21.056 "name": "BaseBdev3", 00:20:21.056 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:21.056 "is_configured": true, 00:20:21.056 "data_offset": 2048, 00:20:21.056 "data_size": 63488 00:20:21.056 }, 00:20:21.056 { 00:20:21.056 "name": "BaseBdev4", 00:20:21.056 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:21.056 "is_configured": true, 00:20:21.056 "data_offset": 2048, 00:20:21.056 "data_size": 63488 00:20:21.056 } 00:20:21.056 ] 00:20:21.056 }' 00:20:21.056 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.056 15:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:21.627 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:21.627 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:21.627 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:21.627 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:21.627 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:21.627 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:21.627 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:21.627 15:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:21.627 [2024-07-12 15:56:42.033620] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:21.627 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:20:21.627 "name": "Existed_Raid", 00:20:21.627 "aliases": [ 00:20:21.627 "70659b49-fab2-4416-a7bd-6a96cb2a34cf" 00:20:21.627 ], 00:20:21.627 "product_name": "Raid Volume", 00:20:21.627 "block_size": 512, 00:20:21.627 "num_blocks": 253952, 00:20:21.627 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:21.627 "assigned_rate_limits": { 00:20:21.627 "rw_ios_per_sec": 0, 00:20:21.627 "rw_mbytes_per_sec": 0, 00:20:21.627 "r_mbytes_per_sec": 0, 00:20:21.627 "w_mbytes_per_sec": 0 00:20:21.627 }, 00:20:21.627 "claimed": false, 00:20:21.627 "zoned": false, 00:20:21.627 "supported_io_types": { 00:20:21.627 "read": true, 00:20:21.627 "write": true, 00:20:21.627 "unmap": true, 00:20:21.627 "flush": true, 00:20:21.627 "reset": true, 00:20:21.627 "nvme_admin": false, 00:20:21.627 "nvme_io": false, 00:20:21.627 "nvme_io_md": false, 00:20:21.627 "write_zeroes": true, 00:20:21.627 "zcopy": false, 00:20:21.627 "get_zone_info": false, 00:20:21.627 "zone_management": false, 00:20:21.627 "zone_append": false, 00:20:21.627 "compare": false, 00:20:21.627 "compare_and_write": false, 00:20:21.627 "abort": false, 00:20:21.627 "seek_hole": false, 00:20:21.627 "seek_data": false, 00:20:21.627 "copy": false, 00:20:21.627 "nvme_iov_md": false 00:20:21.627 }, 00:20:21.627 "memory_domains": [ 00:20:21.627 { 00:20:21.627 "dma_device_id": "system", 00:20:21.627 "dma_device_type": 1 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.627 "dma_device_type": 2 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "dma_device_id": "system", 00:20:21.627 "dma_device_type": 1 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.627 "dma_device_type": 2 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "dma_device_id": "system", 00:20:21.627 "dma_device_type": 1 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.627 "dma_device_type": 2 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "dma_device_id": "system", 00:20:21.627 "dma_device_type": 1 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.627 "dma_device_type": 2 00:20:21.627 } 00:20:21.627 ], 00:20:21.627 "driver_specific": { 00:20:21.627 "raid": { 00:20:21.627 "uuid": "70659b49-fab2-4416-a7bd-6a96cb2a34cf", 00:20:21.627 "strip_size_kb": 64, 00:20:21.627 "state": "online", 00:20:21.627 "raid_level": "concat", 00:20:21.627 "superblock": true, 00:20:21.627 "num_base_bdevs": 4, 00:20:21.627 "num_base_bdevs_discovered": 4, 00:20:21.627 "num_base_bdevs_operational": 4, 00:20:21.627 "base_bdevs_list": [ 00:20:21.627 { 00:20:21.627 "name": "NewBaseBdev", 00:20:21.627 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:21.627 "is_configured": true, 00:20:21.627 "data_offset": 2048, 00:20:21.627 "data_size": 63488 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "name": "BaseBdev2", 00:20:21.627 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:21.627 "is_configured": true, 00:20:21.627 "data_offset": 2048, 00:20:21.627 "data_size": 63488 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "name": "BaseBdev3", 00:20:21.627 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:21.627 "is_configured": true, 00:20:21.627 "data_offset": 2048, 00:20:21.627 "data_size": 63488 00:20:21.627 }, 00:20:21.627 { 00:20:21.627 "name": "BaseBdev4", 00:20:21.627 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:21.627 "is_configured": true, 00:20:21.627 "data_offset": 2048, 00:20:21.627 "data_size": 63488 00:20:21.627 } 
00:20:21.627 ] 00:20:21.627 } 00:20:21.627 } 00:20:21.627 }' 00:20:21.627 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:21.887 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:21.887 BaseBdev2 00:20:21.888 BaseBdev3 00:20:21.888 BaseBdev4' 00:20:21.888 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:21.888 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:21.888 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:21.888 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:21.888 "name": "NewBaseBdev", 00:20:21.888 "aliases": [ 00:20:21.888 "b7b11e65-b8ae-487e-9fca-aa83878ec296" 00:20:21.888 ], 00:20:21.888 "product_name": "Malloc disk", 00:20:21.888 "block_size": 512, 00:20:21.888 "num_blocks": 65536, 00:20:21.888 "uuid": "b7b11e65-b8ae-487e-9fca-aa83878ec296", 00:20:21.888 "assigned_rate_limits": { 00:20:21.888 "rw_ios_per_sec": 0, 00:20:21.888 "rw_mbytes_per_sec": 0, 00:20:21.888 "r_mbytes_per_sec": 0, 00:20:21.888 "w_mbytes_per_sec": 0 00:20:21.888 }, 00:20:21.888 "claimed": true, 00:20:21.888 "claim_type": "exclusive_write", 00:20:21.888 "zoned": false, 00:20:21.888 "supported_io_types": { 00:20:21.888 "read": true, 00:20:21.888 "write": true, 00:20:21.888 "unmap": true, 00:20:21.888 "flush": true, 00:20:21.888 "reset": true, 00:20:21.888 "nvme_admin": false, 00:20:21.888 "nvme_io": false, 00:20:21.888 "nvme_io_md": false, 00:20:21.888 "write_zeroes": true, 00:20:21.888 "zcopy": true, 00:20:21.888 "get_zone_info": false, 00:20:21.888 "zone_management": false, 00:20:21.888 "zone_append": false, 00:20:21.888 "compare": false, 00:20:21.888 "compare_and_write": false, 00:20:21.888 "abort": true, 00:20:21.888 "seek_hole": false, 00:20:21.888 "seek_data": false, 00:20:21.888 "copy": true, 00:20:21.888 "nvme_iov_md": false 00:20:21.888 }, 00:20:21.888 "memory_domains": [ 00:20:21.888 { 00:20:21.888 "dma_device_id": "system", 00:20:21.888 "dma_device_type": 1 00:20:21.888 }, 00:20:21.888 { 00:20:21.888 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.888 "dma_device_type": 2 00:20:21.888 } 00:20:21.888 ], 00:20:21.888 "driver_specific": {} 00:20:21.888 }' 00:20:21.888 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.147 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.147 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:22.147 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.147 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.147 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:22.147 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.147 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.147 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:22.147 
15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.147 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.407 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:22.407 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.407 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:22.407 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.407 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:22.407 "name": "BaseBdev2", 00:20:22.407 "aliases": [ 00:20:22.407 "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7" 00:20:22.407 ], 00:20:22.407 "product_name": "Malloc disk", 00:20:22.407 "block_size": 512, 00:20:22.407 "num_blocks": 65536, 00:20:22.407 "uuid": "1543fa03-bcd0-430f-9dc1-1d4897d5c9a7", 00:20:22.407 "assigned_rate_limits": { 00:20:22.407 "rw_ios_per_sec": 0, 00:20:22.407 "rw_mbytes_per_sec": 0, 00:20:22.407 "r_mbytes_per_sec": 0, 00:20:22.407 "w_mbytes_per_sec": 0 00:20:22.407 }, 00:20:22.407 "claimed": true, 00:20:22.407 "claim_type": "exclusive_write", 00:20:22.407 "zoned": false, 00:20:22.407 "supported_io_types": { 00:20:22.407 "read": true, 00:20:22.407 "write": true, 00:20:22.407 "unmap": true, 00:20:22.407 "flush": true, 00:20:22.407 "reset": true, 00:20:22.407 "nvme_admin": false, 00:20:22.407 "nvme_io": false, 00:20:22.407 "nvme_io_md": false, 00:20:22.407 "write_zeroes": true, 00:20:22.407 "zcopy": true, 00:20:22.407 "get_zone_info": false, 00:20:22.407 "zone_management": false, 00:20:22.407 "zone_append": false, 00:20:22.407 "compare": false, 00:20:22.407 "compare_and_write": false, 00:20:22.407 "abort": true, 00:20:22.407 "seek_hole": false, 00:20:22.407 "seek_data": false, 00:20:22.407 "copy": true, 00:20:22.407 "nvme_iov_md": false 00:20:22.407 }, 00:20:22.407 "memory_domains": [ 00:20:22.407 { 00:20:22.407 "dma_device_id": "system", 00:20:22.407 "dma_device_type": 1 00:20:22.407 }, 00:20:22.407 { 00:20:22.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.407 "dma_device_type": 2 00:20:22.407 } 00:20:22.407 ], 00:20:22.407 "driver_specific": {} 00:20:22.407 }' 00:20:22.407 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.677 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:22.677 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:22.677 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.677 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:22.677 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:22.677 15:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.677 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:22.677 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:22.677 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.940 15:56:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:22.941 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:22.941 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.941 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:22.941 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:22.941 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:22.941 "name": "BaseBdev3", 00:20:22.941 "aliases": [ 00:20:22.941 "026ab199-e9ca-4bdc-82a1-3a0849cac086" 00:20:22.941 ], 00:20:22.941 "product_name": "Malloc disk", 00:20:22.941 "block_size": 512, 00:20:22.941 "num_blocks": 65536, 00:20:22.941 "uuid": "026ab199-e9ca-4bdc-82a1-3a0849cac086", 00:20:22.941 "assigned_rate_limits": { 00:20:22.941 "rw_ios_per_sec": 0, 00:20:22.941 "rw_mbytes_per_sec": 0, 00:20:22.941 "r_mbytes_per_sec": 0, 00:20:22.941 "w_mbytes_per_sec": 0 00:20:22.941 }, 00:20:22.941 "claimed": true, 00:20:22.941 "claim_type": "exclusive_write", 00:20:22.941 "zoned": false, 00:20:22.941 "supported_io_types": { 00:20:22.941 "read": true, 00:20:22.941 "write": true, 00:20:22.941 "unmap": true, 00:20:22.941 "flush": true, 00:20:22.941 "reset": true, 00:20:22.941 "nvme_admin": false, 00:20:22.941 "nvme_io": false, 00:20:22.941 "nvme_io_md": false, 00:20:22.941 "write_zeroes": true, 00:20:22.941 "zcopy": true, 00:20:22.941 "get_zone_info": false, 00:20:22.941 "zone_management": false, 00:20:22.941 "zone_append": false, 00:20:22.941 "compare": false, 00:20:22.941 "compare_and_write": false, 00:20:22.941 "abort": true, 00:20:22.941 "seek_hole": false, 00:20:22.941 "seek_data": false, 00:20:22.941 "copy": true, 00:20:22.941 "nvme_iov_md": false 00:20:22.941 }, 00:20:22.941 "memory_domains": [ 00:20:22.941 { 00:20:22.941 "dma_device_id": "system", 00:20:22.941 "dma_device_type": 1 00:20:22.941 }, 00:20:22.941 { 00:20:22.941 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.941 "dma_device_type": 2 00:20:22.941 } 00:20:22.941 ], 00:20:22.941 "driver_specific": {} 00:20:22.941 }' 00:20:22.941 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.200 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.200 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.200 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.200 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.200 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.200 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.200 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.200 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.200 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.460 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.460 15:56:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.460 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.460 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:23.460 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.720 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.720 "name": "BaseBdev4", 00:20:23.720 "aliases": [ 00:20:23.720 "62cc998d-dc00-47a4-b215-8563ce05d5db" 00:20:23.720 ], 00:20:23.720 "product_name": "Malloc disk", 00:20:23.720 "block_size": 512, 00:20:23.720 "num_blocks": 65536, 00:20:23.720 "uuid": "62cc998d-dc00-47a4-b215-8563ce05d5db", 00:20:23.720 "assigned_rate_limits": { 00:20:23.720 "rw_ios_per_sec": 0, 00:20:23.720 "rw_mbytes_per_sec": 0, 00:20:23.720 "r_mbytes_per_sec": 0, 00:20:23.720 "w_mbytes_per_sec": 0 00:20:23.720 }, 00:20:23.720 "claimed": true, 00:20:23.720 "claim_type": "exclusive_write", 00:20:23.720 "zoned": false, 00:20:23.720 "supported_io_types": { 00:20:23.720 "read": true, 00:20:23.720 "write": true, 00:20:23.720 "unmap": true, 00:20:23.720 "flush": true, 00:20:23.720 "reset": true, 00:20:23.720 "nvme_admin": false, 00:20:23.720 "nvme_io": false, 00:20:23.720 "nvme_io_md": false, 00:20:23.720 "write_zeroes": true, 00:20:23.720 "zcopy": true, 00:20:23.720 "get_zone_info": false, 00:20:23.720 "zone_management": false, 00:20:23.720 "zone_append": false, 00:20:23.720 "compare": false, 00:20:23.720 "compare_and_write": false, 00:20:23.720 "abort": true, 00:20:23.720 "seek_hole": false, 00:20:23.721 "seek_data": false, 00:20:23.721 "copy": true, 00:20:23.721 "nvme_iov_md": false 00:20:23.721 }, 00:20:23.721 "memory_domains": [ 00:20:23.721 { 00:20:23.721 "dma_device_id": "system", 00:20:23.721 "dma_device_type": 1 00:20:23.721 }, 00:20:23.721 { 00:20:23.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.721 "dma_device_type": 2 00:20:23.721 } 00:20:23.721 ], 00:20:23.721 "driver_specific": {} 00:20:23.721 }' 00:20:23.721 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.721 15:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.721 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.721 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.721 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.721 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.721 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.721 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.981 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.981 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.981 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.981 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.981 15:56:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:24.240 [2024-07-12 15:56:44.455632] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:24.240 [2024-07-12 15:56:44.455648] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:24.240 [2024-07-12 15:56:44.455685] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:24.240 [2024-07-12 15:56:44.455741] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:24.240 [2024-07-12 15:56:44.455749] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1134c90 name Existed_Raid, state offline 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2590229 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2590229 ']' 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2590229 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2590229 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2590229' 00:20:24.240 killing process with pid 2590229 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2590229 00:20:24.240 [2024-07-12 15:56:44.520772] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2590229 00:20:24.240 [2024-07-12 15:56:44.541195] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:24.240 00:20:24.240 real 0m27.671s 00:20:24.240 user 0m51.864s 00:20:24.240 sys 0m4.107s 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:24.240 15:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:24.240 ************************************ 00:20:24.240 END TEST raid_state_function_test_sb 00:20:24.240 ************************************ 00:20:24.501 15:56:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:24.501 15:56:44 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:20:24.501 15:56:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:24.501 15:56:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:24.501 15:56:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:24.501 ************************************ 00:20:24.501 START TEST raid_superblock_test 00:20:24.501 
************************************ 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2595495 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2595495 /var/tmp/spdk-raid.sock 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2595495 ']' 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:24.501 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:24.501 15:56:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:24.501 [2024-07-12 15:56:44.803963] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:20:24.501 [2024-07-12 15:56:44.804019] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2595495 ] 00:20:24.501 [2024-07-12 15:56:44.895292] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:24.761 [2024-07-12 15:56:44.963183] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:24.761 [2024-07-12 15:56:45.002702] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:24.761 [2024-07-12 15:56:45.002727] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:25.393 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:25.393 malloc1 00:20:25.653 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:25.653 [2024-07-12 15:56:46.028950] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:25.653 [2024-07-12 15:56:46.028986] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:25.653 [2024-07-12 15:56:46.028999] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1375b50 00:20:25.653 [2024-07-12 15:56:46.029006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:25.653 [2024-07-12 15:56:46.030312] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:25.653 [2024-07-12 15:56:46.030333] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:25.653 pt1 00:20:25.653 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:25.653 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:25.653 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:25.653 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:25.653 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:25.653 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:25.653 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:25.653 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:25.653 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:25.912 malloc2 00:20:25.912 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:26.170 [2024-07-12 15:56:46.415956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:26.170 [2024-07-12 15:56:46.415984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:26.170 [2024-07-12 15:56:46.415994] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1376df0 00:20:26.170 [2024-07-12 15:56:46.416001] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:26.170 [2024-07-12 15:56:46.417159] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:26.170 [2024-07-12 15:56:46.417178] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:26.170 pt2 00:20:26.170 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:26.170 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:26.170 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:26.170 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:26.170 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:26.170 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:26.170 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:26.170 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:26.170 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:26.428 malloc3 00:20:26.428 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:26.428 [2024-07-12 15:56:46.798608] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:26.428 [2024-07-12 15:56:46.798636] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:26.428 [2024-07-12 15:56:46.798645] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1376770 00:20:26.428 [2024-07-12 15:56:46.798652] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:26.428 [2024-07-12 15:56:46.799809] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:26.428 [2024-07-12 15:56:46.799828] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:26.428 pt3 00:20:26.428 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:26.428 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:26.428 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:26.428 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:26.428 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:26.428 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:26.428 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:26.429 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:26.429 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:26.687 malloc4 00:20:26.687 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:26.946 [2024-07-12 15:56:47.201252] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:26.946 [2024-07-12 15:56:47.201282] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:26.946 [2024-07-12 15:56:47.201293] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x136d840 00:20:26.946 [2024-07-12 15:56:47.201300] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:26.946 [2024-07-12 15:56:47.202453] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:26.946 [2024-07-12 15:56:47.202472] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:26.946 pt4 00:20:26.946 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:26.946 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:26.946 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:26.946 [2024-07-12 15:56:47.385733] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:26.946 [2024-07-12 15:56:47.386716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:26.946 [2024-07-12 15:56:47.386758] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:26.946 [2024-07-12 15:56:47.386794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:26.946 [2024-07-12 15:56:47.386929] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15274c0 00:20:26.946 [2024-07-12 15:56:47.386936] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:26.946 [2024-07-12 15:56:47.387078] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1376570 00:20:26.946 [2024-07-12 15:56:47.387189] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15274c0 00:20:26.946 [2024-07-12 15:56:47.387198] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15274c0 00:20:26.946 [2024-07-12 15:56:47.387268] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:27.208 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:27.208 "name": "raid_bdev1", 00:20:27.208 "uuid": "2c556e7d-c7e6-4a0a-b0be-9d37a812a99f", 00:20:27.208 "strip_size_kb": 64, 00:20:27.208 "state": "online", 00:20:27.208 "raid_level": "concat", 00:20:27.208 "superblock": true, 00:20:27.208 "num_base_bdevs": 4, 00:20:27.208 "num_base_bdevs_discovered": 4, 00:20:27.208 "num_base_bdevs_operational": 4, 00:20:27.208 "base_bdevs_list": [ 00:20:27.209 { 00:20:27.209 "name": "pt1", 00:20:27.209 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:27.209 "is_configured": true, 00:20:27.209 "data_offset": 2048, 00:20:27.209 "data_size": 63488 00:20:27.209 }, 00:20:27.209 { 00:20:27.209 "name": "pt2", 00:20:27.209 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:27.209 "is_configured": true, 00:20:27.209 "data_offset": 2048, 00:20:27.209 "data_size": 63488 00:20:27.209 }, 00:20:27.209 { 00:20:27.209 "name": "pt3", 00:20:27.209 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:27.209 "is_configured": true, 00:20:27.209 "data_offset": 2048, 00:20:27.209 "data_size": 63488 00:20:27.209 }, 00:20:27.209 { 00:20:27.209 "name": "pt4", 00:20:27.209 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:27.209 "is_configured": true, 00:20:27.209 "data_offset": 2048, 00:20:27.209 "data_size": 63488 00:20:27.209 } 00:20:27.209 ] 00:20:27.209 }' 00:20:27.209 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:27.209 15:56:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.779 15:56:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:27.779 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:27.779 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:27.779 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:27.779 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:27.779 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:27.779 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:27.779 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:28.039 [2024-07-12 15:56:48.328349] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:28.039 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:28.039 "name": "raid_bdev1", 00:20:28.039 "aliases": [ 00:20:28.039 "2c556e7d-c7e6-4a0a-b0be-9d37a812a99f" 00:20:28.039 ], 00:20:28.039 "product_name": "Raid Volume", 00:20:28.039 "block_size": 512, 00:20:28.039 "num_blocks": 253952, 00:20:28.039 "uuid": "2c556e7d-c7e6-4a0a-b0be-9d37a812a99f", 00:20:28.039 "assigned_rate_limits": { 00:20:28.039 "rw_ios_per_sec": 0, 00:20:28.039 "rw_mbytes_per_sec": 0, 00:20:28.039 "r_mbytes_per_sec": 0, 00:20:28.039 "w_mbytes_per_sec": 0 00:20:28.039 }, 00:20:28.039 "claimed": false, 00:20:28.039 "zoned": false, 00:20:28.039 "supported_io_types": { 00:20:28.039 "read": true, 00:20:28.039 "write": true, 00:20:28.039 "unmap": true, 00:20:28.039 "flush": true, 00:20:28.039 "reset": true, 00:20:28.039 "nvme_admin": false, 00:20:28.039 "nvme_io": false, 00:20:28.039 "nvme_io_md": false, 00:20:28.039 "write_zeroes": true, 00:20:28.039 "zcopy": false, 00:20:28.039 "get_zone_info": false, 00:20:28.039 "zone_management": false, 00:20:28.039 "zone_append": false, 00:20:28.039 "compare": false, 00:20:28.039 "compare_and_write": false, 00:20:28.039 "abort": false, 00:20:28.039 "seek_hole": false, 00:20:28.039 "seek_data": false, 00:20:28.039 "copy": false, 00:20:28.039 "nvme_iov_md": false 00:20:28.039 }, 00:20:28.039 "memory_domains": [ 00:20:28.039 { 00:20:28.039 "dma_device_id": "system", 00:20:28.039 "dma_device_type": 1 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.039 "dma_device_type": 2 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "dma_device_id": "system", 00:20:28.039 "dma_device_type": 1 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.039 "dma_device_type": 2 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "dma_device_id": "system", 00:20:28.039 "dma_device_type": 1 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.039 "dma_device_type": 2 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "dma_device_id": "system", 00:20:28.039 "dma_device_type": 1 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.039 "dma_device_type": 2 00:20:28.039 } 00:20:28.039 ], 00:20:28.039 "driver_specific": { 00:20:28.039 "raid": { 00:20:28.039 "uuid": "2c556e7d-c7e6-4a0a-b0be-9d37a812a99f", 00:20:28.039 "strip_size_kb": 64, 00:20:28.039 "state": "online", 00:20:28.039 "raid_level": "concat", 00:20:28.039 "superblock": 
true, 00:20:28.039 "num_base_bdevs": 4, 00:20:28.039 "num_base_bdevs_discovered": 4, 00:20:28.039 "num_base_bdevs_operational": 4, 00:20:28.039 "base_bdevs_list": [ 00:20:28.039 { 00:20:28.039 "name": "pt1", 00:20:28.039 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:28.039 "is_configured": true, 00:20:28.039 "data_offset": 2048, 00:20:28.039 "data_size": 63488 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "name": "pt2", 00:20:28.039 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:28.039 "is_configured": true, 00:20:28.039 "data_offset": 2048, 00:20:28.039 "data_size": 63488 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "name": "pt3", 00:20:28.039 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:28.039 "is_configured": true, 00:20:28.039 "data_offset": 2048, 00:20:28.039 "data_size": 63488 00:20:28.039 }, 00:20:28.039 { 00:20:28.039 "name": "pt4", 00:20:28.039 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:28.039 "is_configured": true, 00:20:28.039 "data_offset": 2048, 00:20:28.039 "data_size": 63488 00:20:28.039 } 00:20:28.039 ] 00:20:28.039 } 00:20:28.039 } 00:20:28.039 }' 00:20:28.039 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:28.039 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:28.039 pt2 00:20:28.039 pt3 00:20:28.039 pt4' 00:20:28.039 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:28.039 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:28.039 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:28.300 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:28.300 "name": "pt1", 00:20:28.300 "aliases": [ 00:20:28.300 "00000000-0000-0000-0000-000000000001" 00:20:28.300 ], 00:20:28.300 "product_name": "passthru", 00:20:28.300 "block_size": 512, 00:20:28.300 "num_blocks": 65536, 00:20:28.300 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:28.300 "assigned_rate_limits": { 00:20:28.300 "rw_ios_per_sec": 0, 00:20:28.300 "rw_mbytes_per_sec": 0, 00:20:28.300 "r_mbytes_per_sec": 0, 00:20:28.300 "w_mbytes_per_sec": 0 00:20:28.300 }, 00:20:28.300 "claimed": true, 00:20:28.300 "claim_type": "exclusive_write", 00:20:28.300 "zoned": false, 00:20:28.300 "supported_io_types": { 00:20:28.300 "read": true, 00:20:28.300 "write": true, 00:20:28.300 "unmap": true, 00:20:28.300 "flush": true, 00:20:28.300 "reset": true, 00:20:28.300 "nvme_admin": false, 00:20:28.300 "nvme_io": false, 00:20:28.300 "nvme_io_md": false, 00:20:28.300 "write_zeroes": true, 00:20:28.300 "zcopy": true, 00:20:28.300 "get_zone_info": false, 00:20:28.300 "zone_management": false, 00:20:28.300 "zone_append": false, 00:20:28.300 "compare": false, 00:20:28.300 "compare_and_write": false, 00:20:28.300 "abort": true, 00:20:28.300 "seek_hole": false, 00:20:28.300 "seek_data": false, 00:20:28.300 "copy": true, 00:20:28.300 "nvme_iov_md": false 00:20:28.300 }, 00:20:28.300 "memory_domains": [ 00:20:28.300 { 00:20:28.300 "dma_device_id": "system", 00:20:28.300 "dma_device_type": 1 00:20:28.300 }, 00:20:28.300 { 00:20:28.300 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.300 "dma_device_type": 2 00:20:28.300 } 00:20:28.300 ], 00:20:28.300 "driver_specific": { 00:20:28.300 "passthru": 
{ 00:20:28.300 "name": "pt1", 00:20:28.300 "base_bdev_name": "malloc1" 00:20:28.300 } 00:20:28.300 } 00:20:28.300 }' 00:20:28.300 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:28.300 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:28.300 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:28.300 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:28.300 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:28.560 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:28.819 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:28.819 "name": "pt2", 00:20:28.819 "aliases": [ 00:20:28.819 "00000000-0000-0000-0000-000000000002" 00:20:28.819 ], 00:20:28.819 "product_name": "passthru", 00:20:28.819 "block_size": 512, 00:20:28.819 "num_blocks": 65536, 00:20:28.819 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:28.819 "assigned_rate_limits": { 00:20:28.819 "rw_ios_per_sec": 0, 00:20:28.819 "rw_mbytes_per_sec": 0, 00:20:28.819 "r_mbytes_per_sec": 0, 00:20:28.819 "w_mbytes_per_sec": 0 00:20:28.819 }, 00:20:28.819 "claimed": true, 00:20:28.819 "claim_type": "exclusive_write", 00:20:28.819 "zoned": false, 00:20:28.819 "supported_io_types": { 00:20:28.819 "read": true, 00:20:28.819 "write": true, 00:20:28.819 "unmap": true, 00:20:28.819 "flush": true, 00:20:28.819 "reset": true, 00:20:28.819 "nvme_admin": false, 00:20:28.819 "nvme_io": false, 00:20:28.819 "nvme_io_md": false, 00:20:28.819 "write_zeroes": true, 00:20:28.819 "zcopy": true, 00:20:28.819 "get_zone_info": false, 00:20:28.819 "zone_management": false, 00:20:28.819 "zone_append": false, 00:20:28.819 "compare": false, 00:20:28.819 "compare_and_write": false, 00:20:28.819 "abort": true, 00:20:28.819 "seek_hole": false, 00:20:28.819 "seek_data": false, 00:20:28.819 "copy": true, 00:20:28.819 "nvme_iov_md": false 00:20:28.819 }, 00:20:28.819 "memory_domains": [ 00:20:28.819 { 00:20:28.819 "dma_device_id": "system", 00:20:28.819 "dma_device_type": 1 00:20:28.819 }, 00:20:28.819 { 00:20:28.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:28.819 "dma_device_type": 2 00:20:28.819 } 00:20:28.819 ], 00:20:28.819 "driver_specific": { 00:20:28.819 "passthru": { 00:20:28.819 "name": "pt2", 00:20:28.819 "base_bdev_name": "malloc2" 00:20:28.819 } 00:20:28.819 } 00:20:28.819 }' 00:20:28.819 
15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:28.819 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:28.819 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:28.820 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:29.079 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:29.339 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:29.339 "name": "pt3", 00:20:29.339 "aliases": [ 00:20:29.339 "00000000-0000-0000-0000-000000000003" 00:20:29.339 ], 00:20:29.339 "product_name": "passthru", 00:20:29.339 "block_size": 512, 00:20:29.339 "num_blocks": 65536, 00:20:29.339 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:29.339 "assigned_rate_limits": { 00:20:29.339 "rw_ios_per_sec": 0, 00:20:29.339 "rw_mbytes_per_sec": 0, 00:20:29.339 "r_mbytes_per_sec": 0, 00:20:29.339 "w_mbytes_per_sec": 0 00:20:29.339 }, 00:20:29.339 "claimed": true, 00:20:29.339 "claim_type": "exclusive_write", 00:20:29.339 "zoned": false, 00:20:29.339 "supported_io_types": { 00:20:29.339 "read": true, 00:20:29.339 "write": true, 00:20:29.339 "unmap": true, 00:20:29.339 "flush": true, 00:20:29.339 "reset": true, 00:20:29.339 "nvme_admin": false, 00:20:29.339 "nvme_io": false, 00:20:29.339 "nvme_io_md": false, 00:20:29.339 "write_zeroes": true, 00:20:29.339 "zcopy": true, 00:20:29.339 "get_zone_info": false, 00:20:29.339 "zone_management": false, 00:20:29.339 "zone_append": false, 00:20:29.339 "compare": false, 00:20:29.339 "compare_and_write": false, 00:20:29.339 "abort": true, 00:20:29.339 "seek_hole": false, 00:20:29.339 "seek_data": false, 00:20:29.339 "copy": true, 00:20:29.339 "nvme_iov_md": false 00:20:29.339 }, 00:20:29.339 "memory_domains": [ 00:20:29.339 { 00:20:29.339 "dma_device_id": "system", 00:20:29.339 "dma_device_type": 1 00:20:29.339 }, 00:20:29.339 { 00:20:29.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.339 "dma_device_type": 2 00:20:29.339 } 00:20:29.339 ], 00:20:29.339 "driver_specific": { 00:20:29.339 "passthru": { 00:20:29.339 "name": "pt3", 00:20:29.339 "base_bdev_name": "malloc3" 00:20:29.339 } 00:20:29.339 } 00:20:29.339 }' 00:20:29.339 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:29.339 15:56:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:29.339 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:29.339 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:29.598 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:29.599 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:29.599 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:29.599 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:29.599 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:29.599 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:29.599 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:29.599 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:29.599 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:29.599 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:29.599 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:29.858 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:29.858 "name": "pt4", 00:20:29.858 "aliases": [ 00:20:29.858 "00000000-0000-0000-0000-000000000004" 00:20:29.858 ], 00:20:29.858 "product_name": "passthru", 00:20:29.858 "block_size": 512, 00:20:29.858 "num_blocks": 65536, 00:20:29.858 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:29.858 "assigned_rate_limits": { 00:20:29.858 "rw_ios_per_sec": 0, 00:20:29.858 "rw_mbytes_per_sec": 0, 00:20:29.858 "r_mbytes_per_sec": 0, 00:20:29.858 "w_mbytes_per_sec": 0 00:20:29.858 }, 00:20:29.858 "claimed": true, 00:20:29.858 "claim_type": "exclusive_write", 00:20:29.858 "zoned": false, 00:20:29.858 "supported_io_types": { 00:20:29.858 "read": true, 00:20:29.858 "write": true, 00:20:29.858 "unmap": true, 00:20:29.858 "flush": true, 00:20:29.858 "reset": true, 00:20:29.858 "nvme_admin": false, 00:20:29.858 "nvme_io": false, 00:20:29.858 "nvme_io_md": false, 00:20:29.858 "write_zeroes": true, 00:20:29.858 "zcopy": true, 00:20:29.858 "get_zone_info": false, 00:20:29.858 "zone_management": false, 00:20:29.858 "zone_append": false, 00:20:29.858 "compare": false, 00:20:29.858 "compare_and_write": false, 00:20:29.858 "abort": true, 00:20:29.858 "seek_hole": false, 00:20:29.858 "seek_data": false, 00:20:29.858 "copy": true, 00:20:29.858 "nvme_iov_md": false 00:20:29.858 }, 00:20:29.858 "memory_domains": [ 00:20:29.858 { 00:20:29.858 "dma_device_id": "system", 00:20:29.858 "dma_device_type": 1 00:20:29.858 }, 00:20:29.858 { 00:20:29.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.858 "dma_device_type": 2 00:20:29.858 } 00:20:29.858 ], 00:20:29.858 "driver_specific": { 00:20:29.858 "passthru": { 00:20:29.858 "name": "pt4", 00:20:29.858 "base_bdev_name": "malloc4" 00:20:29.858 } 00:20:29.858 } 00:20:29.858 }' 00:20:29.858 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:29.858 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:29.858 15:56:50 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:29.858 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:30.118 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:30.377 [2024-07-12 15:56:50.686293] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:30.377 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2c556e7d-c7e6-4a0a-b0be-9d37a812a99f 00:20:30.377 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 2c556e7d-c7e6-4a0a-b0be-9d37a812a99f ']' 00:20:30.377 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:30.637 [2024-07-12 15:56:50.882549] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:30.637 [2024-07-12 15:56:50.882563] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:30.637 [2024-07-12 15:56:50.882602] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:30.637 [2024-07-12 15:56:50.882651] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:30.637 [2024-07-12 15:56:50.882657] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15274c0 name raid_bdev1, state offline 00:20:30.637 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.637 15:56:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:30.897 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:30.897 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:30.897 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:30.897 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:30.897 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:30.897 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:31.156 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:31.156 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:31.416 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:31.416 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:31.676 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:31.676 15:56:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:31.676 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:31.935 [2024-07-12 15:56:52.225924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:31.935 [2024-07-12 15:56:52.227000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:31.935 [2024-07-12 15:56:52.227033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 
00:20:31.935 [2024-07-12 15:56:52.227059] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:31.935 [2024-07-12 15:56:52.227093] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:31.935 [2024-07-12 15:56:52.227120] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:31.935 [2024-07-12 15:56:52.227134] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:31.935 [2024-07-12 15:56:52.227147] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:31.935 [2024-07-12 15:56:52.227157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:31.935 [2024-07-12 15:56:52.227163] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151c0f0 name raid_bdev1, state configuring 00:20:31.935 request: 00:20:31.935 { 00:20:31.935 "name": "raid_bdev1", 00:20:31.935 "raid_level": "concat", 00:20:31.935 "base_bdevs": [ 00:20:31.935 "malloc1", 00:20:31.935 "malloc2", 00:20:31.935 "malloc3", 00:20:31.935 "malloc4" 00:20:31.935 ], 00:20:31.935 "strip_size_kb": 64, 00:20:31.935 "superblock": false, 00:20:31.935 "method": "bdev_raid_create", 00:20:31.935 "req_id": 1 00:20:31.935 } 00:20:31.935 Got JSON-RPC error response 00:20:31.935 response: 00:20:31.935 { 00:20:31.935 "code": -17, 00:20:31.935 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:31.935 } 00:20:31.935 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:31.935 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:31.935 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:31.935 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:31.935 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.935 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:32.193 [2024-07-12 15:56:52.610833] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:32.193 [2024-07-12 15:56:52.610863] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:32.193 [2024-07-12 15:56:52.610875] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151ecc0 00:20:32.193 [2024-07-12 15:56:52.610881] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:32.193 [2024-07-12 15:56:52.612156] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:32.193 [2024-07-12 15:56:52.612177] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:32.193 [2024-07-12 
15:56:52.612224] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:32.193 [2024-07-12 15:56:52.612244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:32.193 pt1 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.193 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:32.451 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:32.451 "name": "raid_bdev1", 00:20:32.451 "uuid": "2c556e7d-c7e6-4a0a-b0be-9d37a812a99f", 00:20:32.451 "strip_size_kb": 64, 00:20:32.451 "state": "configuring", 00:20:32.451 "raid_level": "concat", 00:20:32.451 "superblock": true, 00:20:32.451 "num_base_bdevs": 4, 00:20:32.451 "num_base_bdevs_discovered": 1, 00:20:32.452 "num_base_bdevs_operational": 4, 00:20:32.452 "base_bdevs_list": [ 00:20:32.452 { 00:20:32.452 "name": "pt1", 00:20:32.452 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:32.452 "is_configured": true, 00:20:32.452 "data_offset": 2048, 00:20:32.452 "data_size": 63488 00:20:32.452 }, 00:20:32.452 { 00:20:32.452 "name": null, 00:20:32.452 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:32.452 "is_configured": false, 00:20:32.452 "data_offset": 2048, 00:20:32.452 "data_size": 63488 00:20:32.452 }, 00:20:32.452 { 00:20:32.452 "name": null, 00:20:32.452 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:32.452 "is_configured": false, 00:20:32.452 "data_offset": 2048, 00:20:32.452 "data_size": 63488 00:20:32.452 }, 00:20:32.452 { 00:20:32.452 "name": null, 00:20:32.452 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:32.452 "is_configured": false, 00:20:32.452 "data_offset": 2048, 00:20:32.452 "data_size": 63488 00:20:32.452 } 00:20:32.452 ] 00:20:32.452 }' 00:20:32.452 15:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:32.452 15:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.021 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:33.021 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:33.280 [2024-07-12 15:56:53.525161] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:33.280 [2024-07-12 15:56:53.525201] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:33.280 [2024-07-12 15:56:53.525216] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151d8f0 00:20:33.280 [2024-07-12 15:56:53.525223] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:33.280 [2024-07-12 15:56:53.525503] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:33.280 [2024-07-12 15:56:53.525515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:33.280 [2024-07-12 15:56:53.525562] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:33.280 [2024-07-12 15:56:53.525576] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:33.280 pt2 00:20:33.280 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:33.280 [2024-07-12 15:56:53.713651] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.539 "name": "raid_bdev1", 00:20:33.539 "uuid": "2c556e7d-c7e6-4a0a-b0be-9d37a812a99f", 00:20:33.539 "strip_size_kb": 64, 00:20:33.539 "state": "configuring", 00:20:33.539 "raid_level": "concat", 00:20:33.539 "superblock": true, 00:20:33.539 "num_base_bdevs": 4, 00:20:33.539 "num_base_bdevs_discovered": 1, 00:20:33.539 "num_base_bdevs_operational": 4, 00:20:33.539 "base_bdevs_list": [ 00:20:33.539 { 00:20:33.539 "name": "pt1", 00:20:33.539 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:33.539 "is_configured": true, 00:20:33.539 "data_offset": 2048, 00:20:33.539 "data_size": 63488 00:20:33.539 }, 00:20:33.539 
{ 00:20:33.539 "name": null, 00:20:33.539 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:33.539 "is_configured": false, 00:20:33.539 "data_offset": 2048, 00:20:33.539 "data_size": 63488 00:20:33.539 }, 00:20:33.539 { 00:20:33.539 "name": null, 00:20:33.539 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:33.539 "is_configured": false, 00:20:33.539 "data_offset": 2048, 00:20:33.539 "data_size": 63488 00:20:33.539 }, 00:20:33.539 { 00:20:33.539 "name": null, 00:20:33.539 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:33.539 "is_configured": false, 00:20:33.539 "data_offset": 2048, 00:20:33.539 "data_size": 63488 00:20:33.539 } 00:20:33.539 ] 00:20:33.539 }' 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.539 15:56:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.108 15:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:34.108 15:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:34.108 15:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:34.367 [2024-07-12 15:56:54.648025] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:34.367 [2024-07-12 15:56:54.648063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.367 [2024-07-12 15:56:54.648076] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151a9c0 00:20:34.367 [2024-07-12 15:56:54.648082] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.367 [2024-07-12 15:56:54.648366] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.367 [2024-07-12 15:56:54.648378] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:34.367 [2024-07-12 15:56:54.648423] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:34.367 [2024-07-12 15:56:54.648438] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:34.367 pt2 00:20:34.367 15:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:34.367 15:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:34.367 15:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:34.627 [2024-07-12 15:56:54.844527] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:34.627 [2024-07-12 15:56:54.844559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.627 [2024-07-12 15:56:54.844571] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x151ad50 00:20:34.627 [2024-07-12 15:56:54.844576] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.627 [2024-07-12 15:56:54.844828] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.627 [2024-07-12 15:56:54.844839] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:34.627 [2024-07-12 15:56:54.844876] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:34.627 [2024-07-12 15:56:54.844887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:34.627 pt3 00:20:34.627 15:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:34.627 15:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:34.627 15:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:34.627 [2024-07-12 15:56:55.028991] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:34.627 [2024-07-12 15:56:55.029011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:34.627 [2024-07-12 15:56:55.029020] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x136d1a0 00:20:34.627 [2024-07-12 15:56:55.029025] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:34.627 [2024-07-12 15:56:55.029240] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:34.627 [2024-07-12 15:56:55.029251] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:34.627 [2024-07-12 15:56:55.029283] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:34.627 [2024-07-12 15:56:55.029294] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:34.627 [2024-07-12 15:56:55.029392] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x151a140 00:20:34.627 [2024-07-12 15:56:55.029398] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:34.627 [2024-07-12 15:56:55.029533] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1373260 00:20:34.627 [2024-07-12 15:56:55.029632] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x151a140 00:20:34.627 [2024-07-12 15:56:55.029637] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x151a140 00:20:34.628 [2024-07-12 15:56:55.029717] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:34.628 pt4 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:34.628 15:56:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.628 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:34.887 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.887 "name": "raid_bdev1", 00:20:34.887 "uuid": "2c556e7d-c7e6-4a0a-b0be-9d37a812a99f", 00:20:34.887 "strip_size_kb": 64, 00:20:34.887 "state": "online", 00:20:34.887 "raid_level": "concat", 00:20:34.887 "superblock": true, 00:20:34.887 "num_base_bdevs": 4, 00:20:34.887 "num_base_bdevs_discovered": 4, 00:20:34.887 "num_base_bdevs_operational": 4, 00:20:34.887 "base_bdevs_list": [ 00:20:34.887 { 00:20:34.887 "name": "pt1", 00:20:34.887 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:34.887 "is_configured": true, 00:20:34.887 "data_offset": 2048, 00:20:34.887 "data_size": 63488 00:20:34.887 }, 00:20:34.887 { 00:20:34.887 "name": "pt2", 00:20:34.887 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:34.887 "is_configured": true, 00:20:34.887 "data_offset": 2048, 00:20:34.887 "data_size": 63488 00:20:34.887 }, 00:20:34.887 { 00:20:34.887 "name": "pt3", 00:20:34.887 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:34.887 "is_configured": true, 00:20:34.887 "data_offset": 2048, 00:20:34.887 "data_size": 63488 00:20:34.887 }, 00:20:34.887 { 00:20:34.887 "name": "pt4", 00:20:34.887 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:34.887 "is_configured": true, 00:20:34.887 "data_offset": 2048, 00:20:34.887 "data_size": 63488 00:20:34.887 } 00:20:34.888 ] 00:20:34.888 }' 00:20:34.888 15:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.888 15:56:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:35.826 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:35.826 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:35.826 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:35.826 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:35.826 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:35.826 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:35.826 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:35.826 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:35.826 [2024-07-12 15:56:56.264366] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:36.087 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:36.087 "name": "raid_bdev1", 00:20:36.087 "aliases": [ 00:20:36.087 "2c556e7d-c7e6-4a0a-b0be-9d37a812a99f" 00:20:36.087 ], 00:20:36.087 "product_name": "Raid Volume", 00:20:36.087 "block_size": 512, 00:20:36.087 "num_blocks": 253952, 00:20:36.087 "uuid": 
"2c556e7d-c7e6-4a0a-b0be-9d37a812a99f", 00:20:36.087 "assigned_rate_limits": { 00:20:36.087 "rw_ios_per_sec": 0, 00:20:36.087 "rw_mbytes_per_sec": 0, 00:20:36.087 "r_mbytes_per_sec": 0, 00:20:36.087 "w_mbytes_per_sec": 0 00:20:36.087 }, 00:20:36.087 "claimed": false, 00:20:36.087 "zoned": false, 00:20:36.087 "supported_io_types": { 00:20:36.087 "read": true, 00:20:36.087 "write": true, 00:20:36.087 "unmap": true, 00:20:36.087 "flush": true, 00:20:36.087 "reset": true, 00:20:36.087 "nvme_admin": false, 00:20:36.087 "nvme_io": false, 00:20:36.087 "nvme_io_md": false, 00:20:36.087 "write_zeroes": true, 00:20:36.087 "zcopy": false, 00:20:36.087 "get_zone_info": false, 00:20:36.087 "zone_management": false, 00:20:36.087 "zone_append": false, 00:20:36.087 "compare": false, 00:20:36.087 "compare_and_write": false, 00:20:36.087 "abort": false, 00:20:36.087 "seek_hole": false, 00:20:36.087 "seek_data": false, 00:20:36.087 "copy": false, 00:20:36.087 "nvme_iov_md": false 00:20:36.087 }, 00:20:36.087 "memory_domains": [ 00:20:36.087 { 00:20:36.087 "dma_device_id": "system", 00:20:36.087 "dma_device_type": 1 00:20:36.087 }, 00:20:36.087 { 00:20:36.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.087 "dma_device_type": 2 00:20:36.087 }, 00:20:36.087 { 00:20:36.087 "dma_device_id": "system", 00:20:36.087 "dma_device_type": 1 00:20:36.087 }, 00:20:36.087 { 00:20:36.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.087 "dma_device_type": 2 00:20:36.087 }, 00:20:36.087 { 00:20:36.087 "dma_device_id": "system", 00:20:36.087 "dma_device_type": 1 00:20:36.087 }, 00:20:36.087 { 00:20:36.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.087 "dma_device_type": 2 00:20:36.087 }, 00:20:36.087 { 00:20:36.087 "dma_device_id": "system", 00:20:36.087 "dma_device_type": 1 00:20:36.087 }, 00:20:36.087 { 00:20:36.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.087 "dma_device_type": 2 00:20:36.087 } 00:20:36.087 ], 00:20:36.087 "driver_specific": { 00:20:36.087 "raid": { 00:20:36.087 "uuid": "2c556e7d-c7e6-4a0a-b0be-9d37a812a99f", 00:20:36.087 "strip_size_kb": 64, 00:20:36.087 "state": "online", 00:20:36.087 "raid_level": "concat", 00:20:36.087 "superblock": true, 00:20:36.087 "num_base_bdevs": 4, 00:20:36.087 "num_base_bdevs_discovered": 4, 00:20:36.087 "num_base_bdevs_operational": 4, 00:20:36.087 "base_bdevs_list": [ 00:20:36.087 { 00:20:36.087 "name": "pt1", 00:20:36.087 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:36.087 "is_configured": true, 00:20:36.087 "data_offset": 2048, 00:20:36.087 "data_size": 63488 00:20:36.087 }, 00:20:36.087 { 00:20:36.087 "name": "pt2", 00:20:36.087 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:36.087 "is_configured": true, 00:20:36.087 "data_offset": 2048, 00:20:36.087 "data_size": 63488 00:20:36.087 }, 00:20:36.087 { 00:20:36.087 "name": "pt3", 00:20:36.088 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:36.088 "is_configured": true, 00:20:36.088 "data_offset": 2048, 00:20:36.088 "data_size": 63488 00:20:36.088 }, 00:20:36.088 { 00:20:36.088 "name": "pt4", 00:20:36.088 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:36.088 "is_configured": true, 00:20:36.088 "data_offset": 2048, 00:20:36.088 "data_size": 63488 00:20:36.088 } 00:20:36.088 ] 00:20:36.088 } 00:20:36.088 } 00:20:36.088 }' 00:20:36.088 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:36.088 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:20:36.088 pt2 00:20:36.088 pt3 00:20:36.088 pt4' 00:20:36.088 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:36.088 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:36.088 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:36.088 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:36.088 "name": "pt1", 00:20:36.088 "aliases": [ 00:20:36.088 "00000000-0000-0000-0000-000000000001" 00:20:36.088 ], 00:20:36.088 "product_name": "passthru", 00:20:36.088 "block_size": 512, 00:20:36.088 "num_blocks": 65536, 00:20:36.088 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:36.088 "assigned_rate_limits": { 00:20:36.088 "rw_ios_per_sec": 0, 00:20:36.088 "rw_mbytes_per_sec": 0, 00:20:36.088 "r_mbytes_per_sec": 0, 00:20:36.088 "w_mbytes_per_sec": 0 00:20:36.088 }, 00:20:36.088 "claimed": true, 00:20:36.088 "claim_type": "exclusive_write", 00:20:36.088 "zoned": false, 00:20:36.088 "supported_io_types": { 00:20:36.088 "read": true, 00:20:36.088 "write": true, 00:20:36.088 "unmap": true, 00:20:36.088 "flush": true, 00:20:36.088 "reset": true, 00:20:36.088 "nvme_admin": false, 00:20:36.088 "nvme_io": false, 00:20:36.088 "nvme_io_md": false, 00:20:36.088 "write_zeroes": true, 00:20:36.088 "zcopy": true, 00:20:36.088 "get_zone_info": false, 00:20:36.088 "zone_management": false, 00:20:36.088 "zone_append": false, 00:20:36.088 "compare": false, 00:20:36.088 "compare_and_write": false, 00:20:36.088 "abort": true, 00:20:36.088 "seek_hole": false, 00:20:36.088 "seek_data": false, 00:20:36.088 "copy": true, 00:20:36.088 "nvme_iov_md": false 00:20:36.088 }, 00:20:36.088 "memory_domains": [ 00:20:36.088 { 00:20:36.088 "dma_device_id": "system", 00:20:36.088 "dma_device_type": 1 00:20:36.088 }, 00:20:36.088 { 00:20:36.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.088 "dma_device_type": 2 00:20:36.088 } 00:20:36.088 ], 00:20:36.088 "driver_specific": { 00:20:36.088 "passthru": { 00:20:36.088 "name": "pt1", 00:20:36.088 "base_bdev_name": "malloc1" 00:20:36.088 } 00:20:36.088 } 00:20:36.088 }' 00:20:36.088 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.348 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.348 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:36.348 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.348 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.348 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:36.348 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.348 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.348 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:36.348 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:36.609 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:36.609 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:36.609 15:56:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:36.609 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:36.609 15:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:36.869 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:36.869 "name": "pt2", 00:20:36.869 "aliases": [ 00:20:36.869 "00000000-0000-0000-0000-000000000002" 00:20:36.869 ], 00:20:36.869 "product_name": "passthru", 00:20:36.869 "block_size": 512, 00:20:36.869 "num_blocks": 65536, 00:20:36.869 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:36.869 "assigned_rate_limits": { 00:20:36.869 "rw_ios_per_sec": 0, 00:20:36.869 "rw_mbytes_per_sec": 0, 00:20:36.869 "r_mbytes_per_sec": 0, 00:20:36.869 "w_mbytes_per_sec": 0 00:20:36.869 }, 00:20:36.869 "claimed": true, 00:20:36.869 "claim_type": "exclusive_write", 00:20:36.869 "zoned": false, 00:20:36.869 "supported_io_types": { 00:20:36.869 "read": true, 00:20:36.869 "write": true, 00:20:36.869 "unmap": true, 00:20:36.869 "flush": true, 00:20:36.869 "reset": true, 00:20:36.869 "nvme_admin": false, 00:20:36.869 "nvme_io": false, 00:20:36.869 "nvme_io_md": false, 00:20:36.869 "write_zeroes": true, 00:20:36.869 "zcopy": true, 00:20:36.869 "get_zone_info": false, 00:20:36.869 "zone_management": false, 00:20:36.869 "zone_append": false, 00:20:36.869 "compare": false, 00:20:36.869 "compare_and_write": false, 00:20:36.869 "abort": true, 00:20:36.869 "seek_hole": false, 00:20:36.869 "seek_data": false, 00:20:36.869 "copy": true, 00:20:36.869 "nvme_iov_md": false 00:20:36.869 }, 00:20:36.869 "memory_domains": [ 00:20:36.869 { 00:20:36.869 "dma_device_id": "system", 00:20:36.869 "dma_device_type": 1 00:20:36.869 }, 00:20:36.869 { 00:20:36.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:36.869 "dma_device_type": 2 00:20:36.869 } 00:20:36.869 ], 00:20:36.869 "driver_specific": { 00:20:36.869 "passthru": { 00:20:36.869 "name": "pt2", 00:20:36.869 "base_bdev_name": "malloc2" 00:20:36.869 } 00:20:36.869 } 00:20:36.869 }' 00:20:36.869 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.869 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:36.869 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:36.869 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.869 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:36.869 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:36.869 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:36.869 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:37.129 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:37.129 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.129 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.129 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:37.129 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:37.129 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:37.129 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:37.389 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:37.389 "name": "pt3", 00:20:37.389 "aliases": [ 00:20:37.389 "00000000-0000-0000-0000-000000000003" 00:20:37.389 ], 00:20:37.389 "product_name": "passthru", 00:20:37.389 "block_size": 512, 00:20:37.389 "num_blocks": 65536, 00:20:37.389 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:37.389 "assigned_rate_limits": { 00:20:37.389 "rw_ios_per_sec": 0, 00:20:37.389 "rw_mbytes_per_sec": 0, 00:20:37.389 "r_mbytes_per_sec": 0, 00:20:37.389 "w_mbytes_per_sec": 0 00:20:37.389 }, 00:20:37.389 "claimed": true, 00:20:37.389 "claim_type": "exclusive_write", 00:20:37.389 "zoned": false, 00:20:37.389 "supported_io_types": { 00:20:37.389 "read": true, 00:20:37.389 "write": true, 00:20:37.389 "unmap": true, 00:20:37.389 "flush": true, 00:20:37.389 "reset": true, 00:20:37.389 "nvme_admin": false, 00:20:37.389 "nvme_io": false, 00:20:37.389 "nvme_io_md": false, 00:20:37.389 "write_zeroes": true, 00:20:37.389 "zcopy": true, 00:20:37.389 "get_zone_info": false, 00:20:37.389 "zone_management": false, 00:20:37.389 "zone_append": false, 00:20:37.389 "compare": false, 00:20:37.389 "compare_and_write": false, 00:20:37.389 "abort": true, 00:20:37.389 "seek_hole": false, 00:20:37.389 "seek_data": false, 00:20:37.389 "copy": true, 00:20:37.389 "nvme_iov_md": false 00:20:37.389 }, 00:20:37.389 "memory_domains": [ 00:20:37.389 { 00:20:37.389 "dma_device_id": "system", 00:20:37.389 "dma_device_type": 1 00:20:37.389 }, 00:20:37.389 { 00:20:37.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:37.389 "dma_device_type": 2 00:20:37.389 } 00:20:37.389 ], 00:20:37.389 "driver_specific": { 00:20:37.389 "passthru": { 00:20:37.389 "name": "pt3", 00:20:37.389 "base_bdev_name": "malloc3" 00:20:37.389 } 00:20:37.389 } 00:20:37.389 }' 00:20:37.389 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:37.389 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:37.389 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:37.389 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:37.389 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:37.389 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:37.389 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:37.389 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:37.648 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:37.648 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.648 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:37.648 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:37.648 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:37.648 15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:37.648 
15:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:37.907 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:37.907 "name": "pt4", 00:20:37.907 "aliases": [ 00:20:37.907 "00000000-0000-0000-0000-000000000004" 00:20:37.907 ], 00:20:37.907 "product_name": "passthru", 00:20:37.907 "block_size": 512, 00:20:37.907 "num_blocks": 65536, 00:20:37.907 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:37.907 "assigned_rate_limits": { 00:20:37.907 "rw_ios_per_sec": 0, 00:20:37.907 "rw_mbytes_per_sec": 0, 00:20:37.907 "r_mbytes_per_sec": 0, 00:20:37.907 "w_mbytes_per_sec": 0 00:20:37.907 }, 00:20:37.907 "claimed": true, 00:20:37.907 "claim_type": "exclusive_write", 00:20:37.907 "zoned": false, 00:20:37.907 "supported_io_types": { 00:20:37.907 "read": true, 00:20:37.907 "write": true, 00:20:37.907 "unmap": true, 00:20:37.907 "flush": true, 00:20:37.907 "reset": true, 00:20:37.907 "nvme_admin": false, 00:20:37.907 "nvme_io": false, 00:20:37.907 "nvme_io_md": false, 00:20:37.907 "write_zeroes": true, 00:20:37.907 "zcopy": true, 00:20:37.907 "get_zone_info": false, 00:20:37.907 "zone_management": false, 00:20:37.907 "zone_append": false, 00:20:37.907 "compare": false, 00:20:37.907 "compare_and_write": false, 00:20:37.907 "abort": true, 00:20:37.907 "seek_hole": false, 00:20:37.907 "seek_data": false, 00:20:37.907 "copy": true, 00:20:37.907 "nvme_iov_md": false 00:20:37.907 }, 00:20:37.907 "memory_domains": [ 00:20:37.907 { 00:20:37.907 "dma_device_id": "system", 00:20:37.907 "dma_device_type": 1 00:20:37.907 }, 00:20:37.907 { 00:20:37.907 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:37.907 "dma_device_type": 2 00:20:37.907 } 00:20:37.907 ], 00:20:37.907 "driver_specific": { 00:20:37.907 "passthru": { 00:20:37.907 "name": "pt4", 00:20:37.907 "base_bdev_name": "malloc4" 00:20:37.907 } 00:20:37.907 } 00:20:37.907 }' 00:20:37.907 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:37.907 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:37.907 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:37.907 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:37.907 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:37.907 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:37.907 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:38.166 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:38.166 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:38.166 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:38.166 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:38.166 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:38.166 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:38.166 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:38.426 [2024-07-12 15:56:58.702529] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:38.426 15:56:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 2c556e7d-c7e6-4a0a-b0be-9d37a812a99f '!=' 2c556e7d-c7e6-4a0a-b0be-9d37a812a99f ']' 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2595495 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2595495 ']' 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2595495 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2595495 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2595495' 00:20:38.426 killing process with pid 2595495 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2595495 00:20:38.426 [2024-07-12 15:56:58.792570] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:38.426 [2024-07-12 15:56:58.792616] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:38.426 [2024-07-12 15:56:58.792666] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:38.426 [2024-07-12 15:56:58.792674] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151a140 name raid_bdev1, state offline 00:20:38.426 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2595495 00:20:38.426 [2024-07-12 15:56:58.813503] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:38.685 15:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:38.685 00:20:38.685 real 0m14.205s 00:20:38.685 user 0m26.190s 00:20:38.685 sys 0m2.075s 00:20:38.685 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:38.685 15:56:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:38.685 ************************************ 00:20:38.685 END TEST raid_superblock_test 00:20:38.685 ************************************ 00:20:38.685 15:56:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:38.685 15:56:58 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:20:38.685 15:56:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:38.685 15:56:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:38.685 15:56:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:38.685 ************************************ 00:20:38.685 START TEST raid_read_error_test 00:20:38.685 ************************************ 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 
-- # raid_io_error_test concat 4 read 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:38.685 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ql7VopGlKZ 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2598239 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2598239 /var/tmp/spdk-raid.sock 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:38.686 
15:56:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2598239 ']' 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:38.686 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:38.686 15:56:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:38.686 [2024-07-12 15:56:59.098800] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:20:38.686 [2024-07-12 15:56:59.098861] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2598239 ] 00:20:38.945 [2024-07-12 15:56:59.185519] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:38.945 [2024-07-12 15:56:59.249626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:38.945 [2024-07-12 15:56:59.292978] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:38.945 [2024-07-12 15:56:59.293017] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:39.515 15:56:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:39.515 15:56:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:39.515 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:39.515 15:56:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:39.774 BaseBdev1_malloc 00:20:39.774 15:57:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:40.034 true 00:20:40.034 15:57:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:40.034 [2024-07-12 15:57:00.428049] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:40.034 [2024-07-12 15:57:00.428082] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:40.034 [2024-07-12 15:57:00.428093] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f16aa0 00:20:40.034 [2024-07-12 15:57:00.428099] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:40.034 [2024-07-12 15:57:00.429323] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:40.034 [2024-07-12 15:57:00.429342] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:40.034 BaseBdev1 00:20:40.034 15:57:00 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:40.034 15:57:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:40.293 BaseBdev2_malloc 00:20:40.293 15:57:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:40.553 true 00:20:40.553 15:57:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:40.813 [2024-07-12 15:57:01.027450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:40.813 [2024-07-12 15:57:01.027481] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:40.813 [2024-07-12 15:57:01.027492] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f1be40 00:20:40.813 [2024-07-12 15:57:01.027499] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:40.813 [2024-07-12 15:57:01.028722] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:40.813 [2024-07-12 15:57:01.028742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:40.813 BaseBdev2 00:20:40.813 15:57:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:40.813 15:57:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:40.813 BaseBdev3_malloc 00:20:40.813 15:57:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:41.072 true 00:20:41.072 15:57:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:41.360 [2024-07-12 15:57:01.598833] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:41.360 [2024-07-12 15:57:01.598861] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:41.360 [2024-07-12 15:57:01.598874] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f1d7f0 00:20:41.360 [2024-07-12 15:57:01.598881] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:41.360 [2024-07-12 15:57:01.600066] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:41.360 [2024-07-12 15:57:01.600085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:41.360 BaseBdev3 00:20:41.360 15:57:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:41.360 15:57:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:41.633 BaseBdev4_malloc 00:20:41.633 15:57:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:41.633 true 00:20:41.633 15:57:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:41.893 [2024-07-12 15:57:02.174192] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:41.893 [2024-07-12 15:57:02.174221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:41.893 [2024-07-12 15:57:02.174233] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f1b8b0 00:20:41.893 [2024-07-12 15:57:02.174239] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:41.893 [2024-07-12 15:57:02.175438] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:41.893 [2024-07-12 15:57:02.175458] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:41.893 BaseBdev4 00:20:41.893 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:42.152 [2024-07-12 15:57:02.362691] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:42.152 [2024-07-12 15:57:02.363716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:42.152 [2024-07-12 15:57:02.363769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:42.152 [2024-07-12 15:57:02.363820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:42.152 [2024-07-12 15:57:02.363997] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f1f290 00:20:42.152 [2024-07-12 15:57:02.364004] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:42.152 [2024-07-12 15:57:02.364150] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f1f7e0 00:20:42.152 [2024-07-12 15:57:02.364266] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f1f290 00:20:42.152 [2024-07-12 15:57:02.364271] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f1f290 00:20:42.152 [2024-07-12 15:57:02.364345] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:42.152 "name": "raid_bdev1", 00:20:42.152 "uuid": "e58e8229-326a-40ba-9d39-351085f4bf04", 00:20:42.152 "strip_size_kb": 64, 00:20:42.152 "state": "online", 00:20:42.152 "raid_level": "concat", 00:20:42.152 "superblock": true, 00:20:42.152 "num_base_bdevs": 4, 00:20:42.152 "num_base_bdevs_discovered": 4, 00:20:42.152 "num_base_bdevs_operational": 4, 00:20:42.152 "base_bdevs_list": [ 00:20:42.152 { 00:20:42.152 "name": "BaseBdev1", 00:20:42.152 "uuid": "2ffbbbf9-6e5f-50f0-887a-684618f4a898", 00:20:42.152 "is_configured": true, 00:20:42.152 "data_offset": 2048, 00:20:42.152 "data_size": 63488 00:20:42.152 }, 00:20:42.152 { 00:20:42.152 "name": "BaseBdev2", 00:20:42.152 "uuid": "fa082e86-17d6-5fb6-a224-8d44969313a3", 00:20:42.152 "is_configured": true, 00:20:42.152 "data_offset": 2048, 00:20:42.152 "data_size": 63488 00:20:42.152 }, 00:20:42.152 { 00:20:42.152 "name": "BaseBdev3", 00:20:42.152 "uuid": "669cc557-8194-5cae-90ca-3f2b8df0714b", 00:20:42.152 "is_configured": true, 00:20:42.152 "data_offset": 2048, 00:20:42.152 "data_size": 63488 00:20:42.152 }, 00:20:42.152 { 00:20:42.152 "name": "BaseBdev4", 00:20:42.152 "uuid": "684789a0-2982-584b-9d33-bfe43453ac6f", 00:20:42.152 "is_configured": true, 00:20:42.152 "data_offset": 2048, 00:20:42.152 "data_size": 63488 00:20:42.152 } 00:20:42.152 ] 00:20:42.152 }' 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:42.152 15:57:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:42.728 15:57:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:42.728 15:57:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:42.728 [2024-07-12 15:57:03.164941] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f23cc0 00:20:43.667 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:43.927 15:57:04 
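The verify_raid_bdev_state checks being set up here work off the bdev_raid_get_bdevs JSON dumped in this part of the trace. A minimal standalone equivalent of that check is sketched below; it is a sketch only, assuming the same rpc.py path and RPC socket as in the trace, that jq is installed, and that the asserted field names match the JSON shown here.

    # Sketch: re-check raid_bdev1 the way verify_raid_bdev_state does in the trace
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    info=$(rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    # Assert on the fields the test cares about: state, level, strip size, member count
    [ "$(jq -r '.state'      <<<"$info")" = online ] || exit 1
    [ "$(jq -r '.raid_level' <<<"$info")" = concat ] || exit 1
    [ "$(jq -r '.strip_size_kb'             <<<"$info")" -eq 64 ] || exit 1
    [ "$(jq -r '.num_base_bdevs_discovered' <<<"$info")" -eq 4 ]  || exit 1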
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:43.927 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:44.187 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.187 "name": "raid_bdev1", 00:20:44.187 "uuid": "e58e8229-326a-40ba-9d39-351085f4bf04", 00:20:44.187 "strip_size_kb": 64, 00:20:44.187 "state": "online", 00:20:44.187 "raid_level": "concat", 00:20:44.187 "superblock": true, 00:20:44.187 "num_base_bdevs": 4, 00:20:44.187 "num_base_bdevs_discovered": 4, 00:20:44.187 "num_base_bdevs_operational": 4, 00:20:44.187 "base_bdevs_list": [ 00:20:44.187 { 00:20:44.187 "name": "BaseBdev1", 00:20:44.187 "uuid": "2ffbbbf9-6e5f-50f0-887a-684618f4a898", 00:20:44.187 "is_configured": true, 00:20:44.187 "data_offset": 2048, 00:20:44.187 "data_size": 63488 00:20:44.187 }, 00:20:44.187 { 00:20:44.187 "name": "BaseBdev2", 00:20:44.187 "uuid": "fa082e86-17d6-5fb6-a224-8d44969313a3", 00:20:44.187 "is_configured": true, 00:20:44.187 "data_offset": 2048, 00:20:44.187 "data_size": 63488 00:20:44.187 }, 00:20:44.187 { 00:20:44.187 "name": "BaseBdev3", 00:20:44.187 "uuid": "669cc557-8194-5cae-90ca-3f2b8df0714b", 00:20:44.187 "is_configured": true, 00:20:44.187 "data_offset": 2048, 00:20:44.187 "data_size": 63488 00:20:44.187 }, 00:20:44.187 { 00:20:44.187 "name": "BaseBdev4", 00:20:44.187 "uuid": "684789a0-2982-584b-9d33-bfe43453ac6f", 00:20:44.187 "is_configured": true, 00:20:44.187 "data_offset": 2048, 00:20:44.187 "data_size": 63488 00:20:44.187 } 00:20:44.187 ] 00:20:44.187 }' 00:20:44.187 15:57:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:44.187 15:57:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:44.756 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:44.756 [2024-07-12 15:57:05.191756] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:44.756 [2024-07-12 15:57:05.191786] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:44.756 [2024-07-12 15:57:05.194378] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:44.756 [2024-07-12 15:57:05.194406] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:44.756 [2024-07-12 15:57:05.194436] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:44.756 [2024-07-12 15:57:05.194442] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f1f290 name raid_bdev1, state offline 00:20:44.756 0 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2598239 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2598239 ']' 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2598239 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2598239 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2598239' 00:20:45.017 killing process with pid 2598239 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2598239 00:20:45.017 [2024-07-12 15:57:05.251229] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2598239 00:20:45.017 [2024-07-12 15:57:05.268507] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ql7VopGlKZ 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:20:45.017 00:20:45.017 real 0m6.383s 00:20:45.017 user 0m10.241s 00:20:45.017 sys 0m0.935s 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:45.017 15:57:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:45.017 ************************************ 00:20:45.017 END TEST raid_read_error_test 00:20:45.017 ************************************ 00:20:45.017 15:57:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:45.017 15:57:05 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:20:45.017 15:57:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:45.017 15:57:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:45.017 15:57:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:45.277 ************************************ 00:20:45.277 START TEST raid_write_error_test 00:20:45.277 ************************************ 00:20:45.277 15:57:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # 
raid_io_error_test concat 4 write 00:20:45.277 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:20:45.277 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:45.277 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:20:45.277 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:45.277 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:45.277 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:45.277 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.U9lcovse4O 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2599411 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2599411 /var/tmp/spdk-raid.sock 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f 
-L bdev_raid 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2599411 ']' 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:45.278 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:45.278 15:57:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:45.278 [2024-07-12 15:57:05.548049] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:20:45.278 [2024-07-12 15:57:05.548104] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2599411 ] 00:20:45.278 [2024-07-12 15:57:05.639704] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:45.278 [2024-07-12 15:57:05.717262] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:45.538 [2024-07-12 15:57:05.764102] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:45.538 [2024-07-12 15:57:05.764129] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:46.109 15:57:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:46.109 15:57:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:46.109 15:57:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:46.109 15:57:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:46.370 BaseBdev1_malloc 00:20:46.370 15:57:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:46.370 true 00:20:46.370 15:57:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:46.630 [2024-07-12 15:57:06.944100] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:46.630 [2024-07-12 15:57:06.944139] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:46.630 [2024-07-12 15:57:06.944150] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18d9aa0 00:20:46.630 [2024-07-12 15:57:06.944156] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:46.630 [2024-07-12 15:57:06.945369] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:46.630 [2024-07-12 15:57:06.945388] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:46.630 BaseBdev1 00:20:46.630 15:57:06 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:46.630 15:57:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:46.890 BaseBdev2_malloc 00:20:46.890 15:57:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:46.890 true 00:20:46.890 15:57:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:47.149 [2024-07-12 15:57:07.495086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:47.149 [2024-07-12 15:57:07.495113] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:47.149 [2024-07-12 15:57:07.495123] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18dee40 00:20:47.149 [2024-07-12 15:57:07.495129] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:47.150 [2024-07-12 15:57:07.496267] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:47.150 [2024-07-12 15:57:07.496285] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:47.150 BaseBdev2 00:20:47.150 15:57:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:47.150 15:57:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:47.409 BaseBdev3_malloc 00:20:47.409 15:57:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:47.670 true 00:20:47.670 15:57:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:47.670 [2024-07-12 15:57:08.026046] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:47.670 [2024-07-12 15:57:08.026072] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:47.670 [2024-07-12 15:57:08.026083] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e07f0 00:20:47.670 [2024-07-12 15:57:08.026089] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:47.670 [2024-07-12 15:57:08.027223] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:47.670 [2024-07-12 15:57:08.027242] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:47.670 BaseBdev3 00:20:47.670 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:47.670 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:47.930 BaseBdev4_malloc 00:20:47.930 15:57:08 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:48.190 true 00:20:48.190 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:48.190 [2024-07-12 15:57:08.593152] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:48.190 [2024-07-12 15:57:08.593179] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:48.190 [2024-07-12 15:57:08.593190] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18de8b0 00:20:48.190 [2024-07-12 15:57:08.593196] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:48.190 [2024-07-12 15:57:08.594333] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:48.190 [2024-07-12 15:57:08.594352] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:48.190 BaseBdev4 00:20:48.190 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:48.451 [2024-07-12 15:57:08.777644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:48.451 [2024-07-12 15:57:08.778650] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:48.451 [2024-07-12 15:57:08.778702] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:48.451 [2024-07-12 15:57:08.778754] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:48.451 [2024-07-12 15:57:08.778931] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18e2290 00:20:48.451 [2024-07-12 15:57:08.778938] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:48.451 [2024-07-12 15:57:08.779074] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e27e0 00:20:48.451 [2024-07-12 15:57:08.779188] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18e2290 00:20:48.451 [2024-07-12 15:57:08.779194] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18e2290 00:20:48.451 [2024-07-12 15:57:08.779266] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.451 15:57:08 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.451 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:48.710 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.710 "name": "raid_bdev1", 00:20:48.710 "uuid": "bc354ab8-a1ff-4a5f-9b71-2feff30668d5", 00:20:48.710 "strip_size_kb": 64, 00:20:48.710 "state": "online", 00:20:48.710 "raid_level": "concat", 00:20:48.710 "superblock": true, 00:20:48.710 "num_base_bdevs": 4, 00:20:48.710 "num_base_bdevs_discovered": 4, 00:20:48.710 "num_base_bdevs_operational": 4, 00:20:48.710 "base_bdevs_list": [ 00:20:48.710 { 00:20:48.710 "name": "BaseBdev1", 00:20:48.710 "uuid": "6e3f60f5-74b7-5fd4-bd13-0cc868adfefc", 00:20:48.710 "is_configured": true, 00:20:48.710 "data_offset": 2048, 00:20:48.710 "data_size": 63488 00:20:48.710 }, 00:20:48.710 { 00:20:48.710 "name": "BaseBdev2", 00:20:48.710 "uuid": "5d881e0d-2f97-5da8-a5ca-3b3d9f8fdf13", 00:20:48.710 "is_configured": true, 00:20:48.710 "data_offset": 2048, 00:20:48.710 "data_size": 63488 00:20:48.710 }, 00:20:48.710 { 00:20:48.710 "name": "BaseBdev3", 00:20:48.710 "uuid": "3ac19ea0-ec29-58a8-9b59-37c3974791a9", 00:20:48.710 "is_configured": true, 00:20:48.710 "data_offset": 2048, 00:20:48.710 "data_size": 63488 00:20:48.710 }, 00:20:48.710 { 00:20:48.710 "name": "BaseBdev4", 00:20:48.710 "uuid": "e46e6790-4d45-5d6a-b659-25bd7b972fa6", 00:20:48.710 "is_configured": true, 00:20:48.710 "data_offset": 2048, 00:20:48.710 "data_size": 63488 00:20:48.710 } 00:20:48.710 ] 00:20:48.710 }' 00:20:48.710 15:57:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.710 15:57:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:49.280 15:57:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:49.280 15:57:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:49.280 [2024-07-12 15:57:09.636037] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e6cc0 00:20:50.219 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.479 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:50.739 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.739 "name": "raid_bdev1", 00:20:50.739 "uuid": "bc354ab8-a1ff-4a5f-9b71-2feff30668d5", 00:20:50.739 "strip_size_kb": 64, 00:20:50.739 "state": "online", 00:20:50.739 "raid_level": "concat", 00:20:50.739 "superblock": true, 00:20:50.739 "num_base_bdevs": 4, 00:20:50.739 "num_base_bdevs_discovered": 4, 00:20:50.739 "num_base_bdevs_operational": 4, 00:20:50.739 "base_bdevs_list": [ 00:20:50.739 { 00:20:50.739 "name": "BaseBdev1", 00:20:50.739 "uuid": "6e3f60f5-74b7-5fd4-bd13-0cc868adfefc", 00:20:50.739 "is_configured": true, 00:20:50.739 "data_offset": 2048, 00:20:50.739 "data_size": 63488 00:20:50.739 }, 00:20:50.739 { 00:20:50.739 "name": "BaseBdev2", 00:20:50.739 "uuid": "5d881e0d-2f97-5da8-a5ca-3b3d9f8fdf13", 00:20:50.739 "is_configured": true, 00:20:50.739 "data_offset": 2048, 00:20:50.739 "data_size": 63488 00:20:50.739 }, 00:20:50.739 { 00:20:50.739 "name": "BaseBdev3", 00:20:50.739 "uuid": "3ac19ea0-ec29-58a8-9b59-37c3974791a9", 00:20:50.739 "is_configured": true, 00:20:50.739 "data_offset": 2048, 00:20:50.739 "data_size": 63488 00:20:50.739 }, 00:20:50.739 { 00:20:50.739 "name": "BaseBdev4", 00:20:50.739 "uuid": "e46e6790-4d45-5d6a-b659-25bd7b972fa6", 00:20:50.739 "is_configured": true, 00:20:50.739 "data_offset": 2048, 00:20:50.739 "data_size": 63488 00:20:50.739 } 00:20:50.739 ] 00:20:50.739 }' 00:20:50.739 15:57:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.739 15:57:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:51.308 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:51.308 [2024-07-12 15:57:11.675965] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:51.308 [2024-07-12 15:57:11.675995] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:51.308 [2024-07-12 15:57:11.678654] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:51.308 [2024-07-12 15:57:11.678683] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:51.308 [2024-07-12 15:57:11.678719] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:20:51.308 [2024-07-12 15:57:11.678725] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18e2290 name raid_bdev1, state offline 00:20:51.308 0 00:20:51.308 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2599411 00:20:51.308 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2599411 ']' 00:20:51.308 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2599411 00:20:51.308 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:20:51.308 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:51.308 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2599411 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2599411' 00:20:51.569 killing process with pid 2599411 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2599411 00:20:51.569 [2024-07-12 15:57:11.764633] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2599411 00:20:51.569 [2024-07-12 15:57:11.781897] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.U9lcovse4O 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:20:51.569 00:20:51.569 real 0m6.443s 00:20:51.569 user 0m10.343s 00:20:51.569 sys 0m0.950s 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:51.569 15:57:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:51.569 ************************************ 00:20:51.569 END TEST raid_write_error_test 00:20:51.569 ************************************ 00:20:51.569 15:57:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:51.569 15:57:11 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:20:51.569 15:57:11 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:20:51.569 15:57:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:51.569 15:57:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:51.569 15:57:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:51.569 
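Condensing what the two error-test traces above actually did over RPC before the next test starts: four malloc bdevs are each wrapped in an error bdev and a passthru bdev, assembled into a concat array with an on-disk superblock, exercised by the already-running bdevperf instance, and one error bdev injects read (or write) failures whose rate is then read back out of the bdevperf log. A rough reconstruction of that sequence follows; it is a sketch only, where the loop, the job control, and the $bdevperf_log value are illustrative, while the individual rpc.py calls are the ones visible in the trace.

    # Assumes bdevperf was started with '-r /var/tmp/spdk-raid.sock -z -f -L bdev_raid' as in the trace
    rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
    bdevperf_log=/raidtest/tmp.example            # the real test allocates this with 'mktemp -p /raidtest'
    for i in 1 2 3 4; do
        rpc bdev_malloc_create 32 512 -b BaseBdev${i}_malloc        # 32 MB malloc bdev, 512-byte blocks
        rpc bdev_error_create BaseBdev${i}_malloc                   # exposes EE_BaseBdev${i}_malloc
        rpc bdev_passthru_create -b EE_BaseBdev${i}_malloc -p BaseBdev${i}
    done
    # concat array, 64k strip size, with superblock (-s)
    rpc bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
    # Start the bdevperf job, give it a moment, then inject failures on the first base bdev
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests &
    sleep 1
    rpc bdev_error_inject_error EE_BaseBdev1_malloc read failure    # 'write failure' in the second test
    wait
    rpc bdev_raid_delete raid_bdev1
    # Pass criterion: bdevperf must have recorded a non-zero failure rate for raid_bdev1
    fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s != "0.00" ]]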
************************************ 00:20:51.569 START TEST raid_state_function_test 00:20:51.569 ************************************ 00:20:51.569 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:20:51.569 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:51.569 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:51.569 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:51.569 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2601158 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2601158' 00:20:51.569 Process raid 
pid: 2601158 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2601158 /var/tmp/spdk-raid.sock 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 2601158 ']' 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:51.569 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:51.569 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:51.570 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:51.829 [2024-07-12 15:57:12.060089] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:20:51.829 [2024-07-12 15:57:12.060137] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:51.829 [2024-07-12 15:57:12.147040] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:51.829 [2024-07-12 15:57:12.211027] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:51.829 [2024-07-12 15:57:12.249599] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:51.829 [2024-07-12 15:57:12.249620] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:52.768 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:52.768 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:20:52.768 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:52.768 [2024-07-12 15:57:13.052797] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:52.768 [2024-07-12 15:57:13.052824] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:52.768 [2024-07-12 15:57:13.052830] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:52.768 [2024-07-12 15:57:13.052837] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:52.768 [2024-07-12 15:57:13.052841] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:52.768 [2024-07-12 15:57:13.052847] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:52.768 [2024-07-12 15:57:13.052852] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:52.768 [2024-07-12 15:57:13.052857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev4 doesn't exist now 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.768 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.029 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.029 "name": "Existed_Raid", 00:20:53.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.029 "strip_size_kb": 0, 00:20:53.029 "state": "configuring", 00:20:53.029 "raid_level": "raid1", 00:20:53.029 "superblock": false, 00:20:53.029 "num_base_bdevs": 4, 00:20:53.029 "num_base_bdevs_discovered": 0, 00:20:53.029 "num_base_bdevs_operational": 4, 00:20:53.029 "base_bdevs_list": [ 00:20:53.029 { 00:20:53.029 "name": "BaseBdev1", 00:20:53.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.029 "is_configured": false, 00:20:53.029 "data_offset": 0, 00:20:53.029 "data_size": 0 00:20:53.029 }, 00:20:53.029 { 00:20:53.029 "name": "BaseBdev2", 00:20:53.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.029 "is_configured": false, 00:20:53.029 "data_offset": 0, 00:20:53.029 "data_size": 0 00:20:53.029 }, 00:20:53.029 { 00:20:53.029 "name": "BaseBdev3", 00:20:53.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.029 "is_configured": false, 00:20:53.029 "data_offset": 0, 00:20:53.029 "data_size": 0 00:20:53.029 }, 00:20:53.029 { 00:20:53.029 "name": "BaseBdev4", 00:20:53.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:53.029 "is_configured": false, 00:20:53.029 "data_offset": 0, 00:20:53.029 "data_size": 0 00:20:53.029 } 00:20:53.029 ] 00:20:53.029 }' 00:20:53.029 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.029 15:57:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:53.599 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:53.599 [2024-07-12 15:57:14.003085] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:53.599 [2024-07-12 15:57:14.003102] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x272e920 name Existed_Raid, state configuring 00:20:53.599 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:53.859 [2024-07-12 15:57:14.199597] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:53.859 [2024-07-12 15:57:14.199615] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:53.859 [2024-07-12 15:57:14.199620] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:53.859 [2024-07-12 15:57:14.199625] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:53.859 [2024-07-12 15:57:14.199630] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:53.859 [2024-07-12 15:57:14.199636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:53.859 [2024-07-12 15:57:14.199641] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:53.859 [2024-07-12 15:57:14.199646] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:53.859 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:54.119 [2024-07-12 15:57:14.390423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:54.119 BaseBdev1 00:20:54.119 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:54.119 15:57:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:54.119 15:57:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:54.119 15:57:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:54.119 15:57:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:54.119 15:57:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:54.119 15:57:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:54.380 [ 00:20:54.380 { 00:20:54.380 "name": "BaseBdev1", 00:20:54.380 "aliases": [ 00:20:54.380 "2953b373-b42f-49d2-8f5d-aab605eb00ac" 00:20:54.380 ], 00:20:54.380 "product_name": "Malloc disk", 00:20:54.380 "block_size": 512, 00:20:54.380 "num_blocks": 65536, 00:20:54.380 "uuid": "2953b373-b42f-49d2-8f5d-aab605eb00ac", 00:20:54.380 "assigned_rate_limits": { 00:20:54.380 "rw_ios_per_sec": 0, 00:20:54.380 "rw_mbytes_per_sec": 0, 00:20:54.380 "r_mbytes_per_sec": 0, 00:20:54.380 "w_mbytes_per_sec": 0 00:20:54.380 }, 00:20:54.380 "claimed": true, 00:20:54.380 "claim_type": "exclusive_write", 00:20:54.380 "zoned": false, 00:20:54.380 "supported_io_types": { 00:20:54.380 "read": true, 00:20:54.380 
"write": true, 00:20:54.380 "unmap": true, 00:20:54.380 "flush": true, 00:20:54.380 "reset": true, 00:20:54.380 "nvme_admin": false, 00:20:54.380 "nvme_io": false, 00:20:54.380 "nvme_io_md": false, 00:20:54.380 "write_zeroes": true, 00:20:54.380 "zcopy": true, 00:20:54.380 "get_zone_info": false, 00:20:54.380 "zone_management": false, 00:20:54.380 "zone_append": false, 00:20:54.380 "compare": false, 00:20:54.380 "compare_and_write": false, 00:20:54.380 "abort": true, 00:20:54.380 "seek_hole": false, 00:20:54.380 "seek_data": false, 00:20:54.380 "copy": true, 00:20:54.380 "nvme_iov_md": false 00:20:54.380 }, 00:20:54.380 "memory_domains": [ 00:20:54.380 { 00:20:54.380 "dma_device_id": "system", 00:20:54.380 "dma_device_type": 1 00:20:54.380 }, 00:20:54.380 { 00:20:54.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.380 "dma_device_type": 2 00:20:54.380 } 00:20:54.380 ], 00:20:54.380 "driver_specific": {} 00:20:54.380 } 00:20:54.380 ] 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.380 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:54.639 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.639 "name": "Existed_Raid", 00:20:54.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.639 "strip_size_kb": 0, 00:20:54.639 "state": "configuring", 00:20:54.639 "raid_level": "raid1", 00:20:54.639 "superblock": false, 00:20:54.639 "num_base_bdevs": 4, 00:20:54.639 "num_base_bdevs_discovered": 1, 00:20:54.639 "num_base_bdevs_operational": 4, 00:20:54.639 "base_bdevs_list": [ 00:20:54.639 { 00:20:54.639 "name": "BaseBdev1", 00:20:54.639 "uuid": "2953b373-b42f-49d2-8f5d-aab605eb00ac", 00:20:54.639 "is_configured": true, 00:20:54.639 "data_offset": 0, 00:20:54.639 "data_size": 65536 00:20:54.639 }, 00:20:54.639 { 00:20:54.639 "name": "BaseBdev2", 00:20:54.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.639 "is_configured": false, 00:20:54.639 "data_offset": 0, 00:20:54.639 "data_size": 0 00:20:54.639 }, 00:20:54.639 { 00:20:54.639 "name": "BaseBdev3", 
00:20:54.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.639 "is_configured": false, 00:20:54.639 "data_offset": 0, 00:20:54.639 "data_size": 0 00:20:54.639 }, 00:20:54.639 { 00:20:54.639 "name": "BaseBdev4", 00:20:54.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:54.639 "is_configured": false, 00:20:54.639 "data_offset": 0, 00:20:54.639 "data_size": 0 00:20:54.639 } 00:20:54.639 ] 00:20:54.639 }' 00:20:54.640 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.640 15:57:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:55.208 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:55.467 [2024-07-12 15:57:15.701724] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:55.467 [2024-07-12 15:57:15.701750] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x272e190 name Existed_Raid, state configuring 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:55.467 [2024-07-12 15:57:15.890224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:55.467 [2024-07-12 15:57:15.891337] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:55.467 [2024-07-12 15:57:15.891364] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:55.467 [2024-07-12 15:57:15.891370] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:55.467 [2024-07-12 15:57:15.891376] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:55.467 [2024-07-12 15:57:15.891381] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:55.467 [2024-07-12 15:57:15.891386] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:55.467 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.726 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.726 "name": "Existed_Raid", 00:20:55.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.726 "strip_size_kb": 0, 00:20:55.726 "state": "configuring", 00:20:55.726 "raid_level": "raid1", 00:20:55.726 "superblock": false, 00:20:55.726 "num_base_bdevs": 4, 00:20:55.726 "num_base_bdevs_discovered": 1, 00:20:55.726 "num_base_bdevs_operational": 4, 00:20:55.726 "base_bdevs_list": [ 00:20:55.726 { 00:20:55.726 "name": "BaseBdev1", 00:20:55.726 "uuid": "2953b373-b42f-49d2-8f5d-aab605eb00ac", 00:20:55.726 "is_configured": true, 00:20:55.726 "data_offset": 0, 00:20:55.726 "data_size": 65536 00:20:55.726 }, 00:20:55.726 { 00:20:55.726 "name": "BaseBdev2", 00:20:55.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.726 "is_configured": false, 00:20:55.726 "data_offset": 0, 00:20:55.726 "data_size": 0 00:20:55.726 }, 00:20:55.726 { 00:20:55.726 "name": "BaseBdev3", 00:20:55.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.726 "is_configured": false, 00:20:55.726 "data_offset": 0, 00:20:55.726 "data_size": 0 00:20:55.726 }, 00:20:55.726 { 00:20:55.726 "name": "BaseBdev4", 00:20:55.726 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:55.726 "is_configured": false, 00:20:55.726 "data_offset": 0, 00:20:55.726 "data_size": 0 00:20:55.726 } 00:20:55.726 ] 00:20:55.726 }' 00:20:55.726 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.726 15:57:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:56.294 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:56.554 [2024-07-12 15:57:16.801372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:56.554 BaseBdev2 00:20:56.554 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:56.554 15:57:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:56.554 15:57:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:56.554 15:57:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:56.554 15:57:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:56.554 15:57:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:56.554 15:57:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:56.554 15:57:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:56.814 [ 00:20:56.814 { 
00:20:56.814 "name": "BaseBdev2", 00:20:56.814 "aliases": [ 00:20:56.814 "a462888b-a70a-42d1-b58b-11ac4478f90c" 00:20:56.814 ], 00:20:56.814 "product_name": "Malloc disk", 00:20:56.814 "block_size": 512, 00:20:56.814 "num_blocks": 65536, 00:20:56.814 "uuid": "a462888b-a70a-42d1-b58b-11ac4478f90c", 00:20:56.814 "assigned_rate_limits": { 00:20:56.814 "rw_ios_per_sec": 0, 00:20:56.814 "rw_mbytes_per_sec": 0, 00:20:56.814 "r_mbytes_per_sec": 0, 00:20:56.814 "w_mbytes_per_sec": 0 00:20:56.814 }, 00:20:56.814 "claimed": true, 00:20:56.814 "claim_type": "exclusive_write", 00:20:56.814 "zoned": false, 00:20:56.814 "supported_io_types": { 00:20:56.814 "read": true, 00:20:56.814 "write": true, 00:20:56.814 "unmap": true, 00:20:56.814 "flush": true, 00:20:56.814 "reset": true, 00:20:56.814 "nvme_admin": false, 00:20:56.814 "nvme_io": false, 00:20:56.814 "nvme_io_md": false, 00:20:56.814 "write_zeroes": true, 00:20:56.814 "zcopy": true, 00:20:56.814 "get_zone_info": false, 00:20:56.814 "zone_management": false, 00:20:56.814 "zone_append": false, 00:20:56.814 "compare": false, 00:20:56.814 "compare_and_write": false, 00:20:56.814 "abort": true, 00:20:56.814 "seek_hole": false, 00:20:56.814 "seek_data": false, 00:20:56.814 "copy": true, 00:20:56.814 "nvme_iov_md": false 00:20:56.814 }, 00:20:56.814 "memory_domains": [ 00:20:56.814 { 00:20:56.814 "dma_device_id": "system", 00:20:56.814 "dma_device_type": 1 00:20:56.814 }, 00:20:56.814 { 00:20:56.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.814 "dma_device_type": 2 00:20:56.814 } 00:20:56.814 ], 00:20:56.814 "driver_specific": {} 00:20:56.814 } 00:20:56.814 ] 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.814 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.074 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.074 
"name": "Existed_Raid", 00:20:57.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.074 "strip_size_kb": 0, 00:20:57.074 "state": "configuring", 00:20:57.074 "raid_level": "raid1", 00:20:57.074 "superblock": false, 00:20:57.074 "num_base_bdevs": 4, 00:20:57.074 "num_base_bdevs_discovered": 2, 00:20:57.074 "num_base_bdevs_operational": 4, 00:20:57.074 "base_bdevs_list": [ 00:20:57.074 { 00:20:57.074 "name": "BaseBdev1", 00:20:57.074 "uuid": "2953b373-b42f-49d2-8f5d-aab605eb00ac", 00:20:57.074 "is_configured": true, 00:20:57.074 "data_offset": 0, 00:20:57.074 "data_size": 65536 00:20:57.074 }, 00:20:57.074 { 00:20:57.074 "name": "BaseBdev2", 00:20:57.074 "uuid": "a462888b-a70a-42d1-b58b-11ac4478f90c", 00:20:57.074 "is_configured": true, 00:20:57.074 "data_offset": 0, 00:20:57.074 "data_size": 65536 00:20:57.074 }, 00:20:57.074 { 00:20:57.074 "name": "BaseBdev3", 00:20:57.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.074 "is_configured": false, 00:20:57.074 "data_offset": 0, 00:20:57.074 "data_size": 0 00:20:57.074 }, 00:20:57.074 { 00:20:57.074 "name": "BaseBdev4", 00:20:57.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.074 "is_configured": false, 00:20:57.074 "data_offset": 0, 00:20:57.074 "data_size": 0 00:20:57.074 } 00:20:57.074 ] 00:20:57.074 }' 00:20:57.074 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.074 15:57:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:57.643 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:57.643 [2024-07-12 15:57:18.073396] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:57.643 BaseBdev3 00:20:57.643 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:57.643 15:57:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:57.643 15:57:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:57.643 15:57:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:57.643 15:57:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:57.643 15:57:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:57.643 15:57:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:57.903 15:57:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:58.164 [ 00:20:58.164 { 00:20:58.164 "name": "BaseBdev3", 00:20:58.164 "aliases": [ 00:20:58.164 "c707694e-5df0-4f7d-bad7-5dcca759ca66" 00:20:58.164 ], 00:20:58.164 "product_name": "Malloc disk", 00:20:58.164 "block_size": 512, 00:20:58.164 "num_blocks": 65536, 00:20:58.164 "uuid": "c707694e-5df0-4f7d-bad7-5dcca759ca66", 00:20:58.164 "assigned_rate_limits": { 00:20:58.164 "rw_ios_per_sec": 0, 00:20:58.164 "rw_mbytes_per_sec": 0, 00:20:58.164 "r_mbytes_per_sec": 0, 00:20:58.164 "w_mbytes_per_sec": 0 00:20:58.164 }, 00:20:58.164 "claimed": true, 00:20:58.164 "claim_type": 
"exclusive_write", 00:20:58.164 "zoned": false, 00:20:58.164 "supported_io_types": { 00:20:58.164 "read": true, 00:20:58.164 "write": true, 00:20:58.164 "unmap": true, 00:20:58.164 "flush": true, 00:20:58.164 "reset": true, 00:20:58.164 "nvme_admin": false, 00:20:58.164 "nvme_io": false, 00:20:58.164 "nvme_io_md": false, 00:20:58.164 "write_zeroes": true, 00:20:58.164 "zcopy": true, 00:20:58.164 "get_zone_info": false, 00:20:58.164 "zone_management": false, 00:20:58.164 "zone_append": false, 00:20:58.164 "compare": false, 00:20:58.164 "compare_and_write": false, 00:20:58.164 "abort": true, 00:20:58.164 "seek_hole": false, 00:20:58.164 "seek_data": false, 00:20:58.164 "copy": true, 00:20:58.164 "nvme_iov_md": false 00:20:58.164 }, 00:20:58.164 "memory_domains": [ 00:20:58.164 { 00:20:58.164 "dma_device_id": "system", 00:20:58.164 "dma_device_type": 1 00:20:58.164 }, 00:20:58.164 { 00:20:58.164 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.164 "dma_device_type": 2 00:20:58.164 } 00:20:58.164 ], 00:20:58.164 "driver_specific": {} 00:20:58.164 } 00:20:58.164 ] 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.164 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:58.424 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.424 "name": "Existed_Raid", 00:20:58.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.424 "strip_size_kb": 0, 00:20:58.424 "state": "configuring", 00:20:58.424 "raid_level": "raid1", 00:20:58.424 "superblock": false, 00:20:58.424 "num_base_bdevs": 4, 00:20:58.424 "num_base_bdevs_discovered": 3, 00:20:58.424 "num_base_bdevs_operational": 4, 00:20:58.424 "base_bdevs_list": [ 00:20:58.424 { 00:20:58.424 "name": "BaseBdev1", 00:20:58.424 "uuid": "2953b373-b42f-49d2-8f5d-aab605eb00ac", 00:20:58.424 "is_configured": true, 00:20:58.424 
"data_offset": 0, 00:20:58.424 "data_size": 65536 00:20:58.424 }, 00:20:58.424 { 00:20:58.424 "name": "BaseBdev2", 00:20:58.424 "uuid": "a462888b-a70a-42d1-b58b-11ac4478f90c", 00:20:58.424 "is_configured": true, 00:20:58.424 "data_offset": 0, 00:20:58.424 "data_size": 65536 00:20:58.424 }, 00:20:58.424 { 00:20:58.424 "name": "BaseBdev3", 00:20:58.424 "uuid": "c707694e-5df0-4f7d-bad7-5dcca759ca66", 00:20:58.424 "is_configured": true, 00:20:58.424 "data_offset": 0, 00:20:58.424 "data_size": 65536 00:20:58.424 }, 00:20:58.424 { 00:20:58.424 "name": "BaseBdev4", 00:20:58.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.424 "is_configured": false, 00:20:58.424 "data_offset": 0, 00:20:58.424 "data_size": 0 00:20:58.424 } 00:20:58.424 ] 00:20:58.424 }' 00:20:58.424 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.424 15:57:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:58.993 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:58.993 [2024-07-12 15:57:19.377519] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:58.993 [2024-07-12 15:57:19.377545] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x272f1d0 00:20:58.993 [2024-07-12 15:57:19.377549] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:58.993 [2024-07-12 15:57:19.377739] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2730220 00:20:58.993 [2024-07-12 15:57:19.377838] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x272f1d0 00:20:58.993 [2024-07-12 15:57:19.377844] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x272f1d0 00:20:58.993 [2024-07-12 15:57:19.377960] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:58.993 BaseBdev4 00:20:58.994 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:58.994 15:57:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:58.994 15:57:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:58.994 15:57:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:58.994 15:57:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:58.994 15:57:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:58.994 15:57:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:59.253 15:57:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:59.513 [ 00:20:59.513 { 00:20:59.513 "name": "BaseBdev4", 00:20:59.513 "aliases": [ 00:20:59.513 "b5b0c83b-4d1a-4393-9a4f-16ed9750f8ef" 00:20:59.513 ], 00:20:59.513 "product_name": "Malloc disk", 00:20:59.513 "block_size": 512, 00:20:59.513 "num_blocks": 65536, 00:20:59.513 "uuid": "b5b0c83b-4d1a-4393-9a4f-16ed9750f8ef", 00:20:59.513 "assigned_rate_limits": 
{ 00:20:59.513 "rw_ios_per_sec": 0, 00:20:59.513 "rw_mbytes_per_sec": 0, 00:20:59.513 "r_mbytes_per_sec": 0, 00:20:59.513 "w_mbytes_per_sec": 0 00:20:59.513 }, 00:20:59.513 "claimed": true, 00:20:59.513 "claim_type": "exclusive_write", 00:20:59.513 "zoned": false, 00:20:59.513 "supported_io_types": { 00:20:59.513 "read": true, 00:20:59.513 "write": true, 00:20:59.513 "unmap": true, 00:20:59.513 "flush": true, 00:20:59.513 "reset": true, 00:20:59.513 "nvme_admin": false, 00:20:59.513 "nvme_io": false, 00:20:59.513 "nvme_io_md": false, 00:20:59.513 "write_zeroes": true, 00:20:59.513 "zcopy": true, 00:20:59.513 "get_zone_info": false, 00:20:59.513 "zone_management": false, 00:20:59.513 "zone_append": false, 00:20:59.513 "compare": false, 00:20:59.513 "compare_and_write": false, 00:20:59.513 "abort": true, 00:20:59.513 "seek_hole": false, 00:20:59.513 "seek_data": false, 00:20:59.513 "copy": true, 00:20:59.513 "nvme_iov_md": false 00:20:59.513 }, 00:20:59.513 "memory_domains": [ 00:20:59.513 { 00:20:59.513 "dma_device_id": "system", 00:20:59.513 "dma_device_type": 1 00:20:59.513 }, 00:20:59.513 { 00:20:59.513 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.513 "dma_device_type": 2 00:20:59.513 } 00:20:59.513 ], 00:20:59.513 "driver_specific": {} 00:20:59.513 } 00:20:59.513 ] 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.513 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:59.773 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.773 "name": "Existed_Raid", 00:20:59.773 "uuid": "1362d845-732f-470a-9944-451511a8a97f", 00:20:59.773 "strip_size_kb": 0, 00:20:59.773 "state": "online", 00:20:59.773 "raid_level": "raid1", 00:20:59.773 "superblock": false, 00:20:59.773 "num_base_bdevs": 4, 00:20:59.773 "num_base_bdevs_discovered": 4, 00:20:59.773 "num_base_bdevs_operational": 4, 
00:20:59.773 "base_bdevs_list": [ 00:20:59.773 { 00:20:59.773 "name": "BaseBdev1", 00:20:59.773 "uuid": "2953b373-b42f-49d2-8f5d-aab605eb00ac", 00:20:59.773 "is_configured": true, 00:20:59.773 "data_offset": 0, 00:20:59.773 "data_size": 65536 00:20:59.773 }, 00:20:59.774 { 00:20:59.774 "name": "BaseBdev2", 00:20:59.774 "uuid": "a462888b-a70a-42d1-b58b-11ac4478f90c", 00:20:59.774 "is_configured": true, 00:20:59.774 "data_offset": 0, 00:20:59.774 "data_size": 65536 00:20:59.774 }, 00:20:59.774 { 00:20:59.774 "name": "BaseBdev3", 00:20:59.774 "uuid": "c707694e-5df0-4f7d-bad7-5dcca759ca66", 00:20:59.774 "is_configured": true, 00:20:59.774 "data_offset": 0, 00:20:59.774 "data_size": 65536 00:20:59.774 }, 00:20:59.774 { 00:20:59.774 "name": "BaseBdev4", 00:20:59.774 "uuid": "b5b0c83b-4d1a-4393-9a4f-16ed9750f8ef", 00:20:59.774 "is_configured": true, 00:20:59.774 "data_offset": 0, 00:20:59.774 "data_size": 65536 00:20:59.774 } 00:20:59.774 ] 00:20:59.774 }' 00:20:59.774 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.774 15:57:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:00.344 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:00.344 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:00.344 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:00.344 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:00.344 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:00.344 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:00.344 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:00.344 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:00.344 [2024-07-12 15:57:20.673067] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:00.344 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:00.344 "name": "Existed_Raid", 00:21:00.344 "aliases": [ 00:21:00.344 "1362d845-732f-470a-9944-451511a8a97f" 00:21:00.344 ], 00:21:00.344 "product_name": "Raid Volume", 00:21:00.344 "block_size": 512, 00:21:00.344 "num_blocks": 65536, 00:21:00.344 "uuid": "1362d845-732f-470a-9944-451511a8a97f", 00:21:00.344 "assigned_rate_limits": { 00:21:00.344 "rw_ios_per_sec": 0, 00:21:00.344 "rw_mbytes_per_sec": 0, 00:21:00.344 "r_mbytes_per_sec": 0, 00:21:00.344 "w_mbytes_per_sec": 0 00:21:00.344 }, 00:21:00.344 "claimed": false, 00:21:00.344 "zoned": false, 00:21:00.344 "supported_io_types": { 00:21:00.344 "read": true, 00:21:00.344 "write": true, 00:21:00.344 "unmap": false, 00:21:00.344 "flush": false, 00:21:00.344 "reset": true, 00:21:00.344 "nvme_admin": false, 00:21:00.344 "nvme_io": false, 00:21:00.345 "nvme_io_md": false, 00:21:00.345 "write_zeroes": true, 00:21:00.345 "zcopy": false, 00:21:00.345 "get_zone_info": false, 00:21:00.345 "zone_management": false, 00:21:00.345 "zone_append": false, 00:21:00.345 "compare": false, 00:21:00.345 "compare_and_write": false, 00:21:00.345 "abort": false, 00:21:00.345 "seek_hole": false, 00:21:00.345 "seek_data": false, 
00:21:00.345 "copy": false, 00:21:00.345 "nvme_iov_md": false 00:21:00.345 }, 00:21:00.345 "memory_domains": [ 00:21:00.345 { 00:21:00.345 "dma_device_id": "system", 00:21:00.345 "dma_device_type": 1 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.345 "dma_device_type": 2 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "dma_device_id": "system", 00:21:00.345 "dma_device_type": 1 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.345 "dma_device_type": 2 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "dma_device_id": "system", 00:21:00.345 "dma_device_type": 1 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.345 "dma_device_type": 2 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "dma_device_id": "system", 00:21:00.345 "dma_device_type": 1 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.345 "dma_device_type": 2 00:21:00.345 } 00:21:00.345 ], 00:21:00.345 "driver_specific": { 00:21:00.345 "raid": { 00:21:00.345 "uuid": "1362d845-732f-470a-9944-451511a8a97f", 00:21:00.345 "strip_size_kb": 0, 00:21:00.345 "state": "online", 00:21:00.345 "raid_level": "raid1", 00:21:00.345 "superblock": false, 00:21:00.345 "num_base_bdevs": 4, 00:21:00.345 "num_base_bdevs_discovered": 4, 00:21:00.345 "num_base_bdevs_operational": 4, 00:21:00.345 "base_bdevs_list": [ 00:21:00.345 { 00:21:00.345 "name": "BaseBdev1", 00:21:00.345 "uuid": "2953b373-b42f-49d2-8f5d-aab605eb00ac", 00:21:00.345 "is_configured": true, 00:21:00.345 "data_offset": 0, 00:21:00.345 "data_size": 65536 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "name": "BaseBdev2", 00:21:00.345 "uuid": "a462888b-a70a-42d1-b58b-11ac4478f90c", 00:21:00.345 "is_configured": true, 00:21:00.345 "data_offset": 0, 00:21:00.345 "data_size": 65536 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "name": "BaseBdev3", 00:21:00.345 "uuid": "c707694e-5df0-4f7d-bad7-5dcca759ca66", 00:21:00.345 "is_configured": true, 00:21:00.345 "data_offset": 0, 00:21:00.345 "data_size": 65536 00:21:00.345 }, 00:21:00.345 { 00:21:00.345 "name": "BaseBdev4", 00:21:00.345 "uuid": "b5b0c83b-4d1a-4393-9a4f-16ed9750f8ef", 00:21:00.345 "is_configured": true, 00:21:00.345 "data_offset": 0, 00:21:00.345 "data_size": 65536 00:21:00.345 } 00:21:00.345 ] 00:21:00.345 } 00:21:00.345 } 00:21:00.345 }' 00:21:00.345 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:00.345 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:00.345 BaseBdev2 00:21:00.345 BaseBdev3 00:21:00.345 BaseBdev4' 00:21:00.345 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:00.345 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:00.345 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:00.604 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:00.604 "name": "BaseBdev1", 00:21:00.604 "aliases": [ 00:21:00.604 "2953b373-b42f-49d2-8f5d-aab605eb00ac" 00:21:00.604 ], 00:21:00.604 "product_name": "Malloc disk", 00:21:00.604 "block_size": 512, 00:21:00.604 "num_blocks": 65536, 00:21:00.604 "uuid": 
"2953b373-b42f-49d2-8f5d-aab605eb00ac", 00:21:00.604 "assigned_rate_limits": { 00:21:00.604 "rw_ios_per_sec": 0, 00:21:00.604 "rw_mbytes_per_sec": 0, 00:21:00.604 "r_mbytes_per_sec": 0, 00:21:00.604 "w_mbytes_per_sec": 0 00:21:00.604 }, 00:21:00.604 "claimed": true, 00:21:00.604 "claim_type": "exclusive_write", 00:21:00.604 "zoned": false, 00:21:00.604 "supported_io_types": { 00:21:00.604 "read": true, 00:21:00.604 "write": true, 00:21:00.604 "unmap": true, 00:21:00.604 "flush": true, 00:21:00.604 "reset": true, 00:21:00.604 "nvme_admin": false, 00:21:00.604 "nvme_io": false, 00:21:00.604 "nvme_io_md": false, 00:21:00.604 "write_zeroes": true, 00:21:00.604 "zcopy": true, 00:21:00.604 "get_zone_info": false, 00:21:00.604 "zone_management": false, 00:21:00.604 "zone_append": false, 00:21:00.604 "compare": false, 00:21:00.604 "compare_and_write": false, 00:21:00.604 "abort": true, 00:21:00.604 "seek_hole": false, 00:21:00.604 "seek_data": false, 00:21:00.604 "copy": true, 00:21:00.604 "nvme_iov_md": false 00:21:00.604 }, 00:21:00.604 "memory_domains": [ 00:21:00.604 { 00:21:00.604 "dma_device_id": "system", 00:21:00.604 "dma_device_type": 1 00:21:00.604 }, 00:21:00.604 { 00:21:00.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.604 "dma_device_type": 2 00:21:00.604 } 00:21:00.604 ], 00:21:00.604 "driver_specific": {} 00:21:00.604 }' 00:21:00.604 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:00.604 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:00.604 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:00.604 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:00.863 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.134 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.134 "name": "BaseBdev2", 00:21:01.134 "aliases": [ 00:21:01.134 "a462888b-a70a-42d1-b58b-11ac4478f90c" 00:21:01.134 ], 00:21:01.134 "product_name": "Malloc disk", 00:21:01.134 "block_size": 512, 00:21:01.134 "num_blocks": 65536, 00:21:01.134 "uuid": "a462888b-a70a-42d1-b58b-11ac4478f90c", 00:21:01.134 "assigned_rate_limits": { 00:21:01.134 "rw_ios_per_sec": 0, 00:21:01.134 "rw_mbytes_per_sec": 0, 00:21:01.134 
"r_mbytes_per_sec": 0, 00:21:01.134 "w_mbytes_per_sec": 0 00:21:01.134 }, 00:21:01.134 "claimed": true, 00:21:01.134 "claim_type": "exclusive_write", 00:21:01.134 "zoned": false, 00:21:01.134 "supported_io_types": { 00:21:01.134 "read": true, 00:21:01.134 "write": true, 00:21:01.134 "unmap": true, 00:21:01.134 "flush": true, 00:21:01.134 "reset": true, 00:21:01.134 "nvme_admin": false, 00:21:01.134 "nvme_io": false, 00:21:01.134 "nvme_io_md": false, 00:21:01.134 "write_zeroes": true, 00:21:01.134 "zcopy": true, 00:21:01.134 "get_zone_info": false, 00:21:01.134 "zone_management": false, 00:21:01.134 "zone_append": false, 00:21:01.134 "compare": false, 00:21:01.134 "compare_and_write": false, 00:21:01.134 "abort": true, 00:21:01.134 "seek_hole": false, 00:21:01.134 "seek_data": false, 00:21:01.134 "copy": true, 00:21:01.134 "nvme_iov_md": false 00:21:01.134 }, 00:21:01.134 "memory_domains": [ 00:21:01.134 { 00:21:01.134 "dma_device_id": "system", 00:21:01.134 "dma_device_type": 1 00:21:01.134 }, 00:21:01.134 { 00:21:01.134 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.134 "dma_device_type": 2 00:21:01.134 } 00:21:01.134 ], 00:21:01.134 "driver_specific": {} 00:21:01.134 }' 00:21:01.134 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.134 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.134 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:01.134 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.134 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:01.438 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:01.704 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:01.704 "name": "BaseBdev3", 00:21:01.704 "aliases": [ 00:21:01.704 "c707694e-5df0-4f7d-bad7-5dcca759ca66" 00:21:01.704 ], 00:21:01.704 "product_name": "Malloc disk", 00:21:01.704 "block_size": 512, 00:21:01.704 "num_blocks": 65536, 00:21:01.704 "uuid": "c707694e-5df0-4f7d-bad7-5dcca759ca66", 00:21:01.704 "assigned_rate_limits": { 00:21:01.704 "rw_ios_per_sec": 0, 00:21:01.704 "rw_mbytes_per_sec": 0, 00:21:01.704 "r_mbytes_per_sec": 0, 00:21:01.704 "w_mbytes_per_sec": 0 00:21:01.704 }, 00:21:01.704 "claimed": true, 00:21:01.704 "claim_type": "exclusive_write", 00:21:01.704 "zoned": false, 
00:21:01.704 "supported_io_types": { 00:21:01.704 "read": true, 00:21:01.704 "write": true, 00:21:01.704 "unmap": true, 00:21:01.704 "flush": true, 00:21:01.704 "reset": true, 00:21:01.704 "nvme_admin": false, 00:21:01.704 "nvme_io": false, 00:21:01.704 "nvme_io_md": false, 00:21:01.704 "write_zeroes": true, 00:21:01.704 "zcopy": true, 00:21:01.704 "get_zone_info": false, 00:21:01.704 "zone_management": false, 00:21:01.704 "zone_append": false, 00:21:01.704 "compare": false, 00:21:01.704 "compare_and_write": false, 00:21:01.704 "abort": true, 00:21:01.704 "seek_hole": false, 00:21:01.704 "seek_data": false, 00:21:01.704 "copy": true, 00:21:01.704 "nvme_iov_md": false 00:21:01.704 }, 00:21:01.704 "memory_domains": [ 00:21:01.704 { 00:21:01.704 "dma_device_id": "system", 00:21:01.704 "dma_device_type": 1 00:21:01.704 }, 00:21:01.704 { 00:21:01.704 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.704 "dma_device_type": 2 00:21:01.704 } 00:21:01.704 ], 00:21:01.704 "driver_specific": {} 00:21:01.704 }' 00:21:01.704 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.704 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:01.704 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:01.704 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.704 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:01.964 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:02.223 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:02.223 "name": "BaseBdev4", 00:21:02.223 "aliases": [ 00:21:02.223 "b5b0c83b-4d1a-4393-9a4f-16ed9750f8ef" 00:21:02.223 ], 00:21:02.223 "product_name": "Malloc disk", 00:21:02.223 "block_size": 512, 00:21:02.223 "num_blocks": 65536, 00:21:02.223 "uuid": "b5b0c83b-4d1a-4393-9a4f-16ed9750f8ef", 00:21:02.223 "assigned_rate_limits": { 00:21:02.223 "rw_ios_per_sec": 0, 00:21:02.223 "rw_mbytes_per_sec": 0, 00:21:02.223 "r_mbytes_per_sec": 0, 00:21:02.223 "w_mbytes_per_sec": 0 00:21:02.223 }, 00:21:02.223 "claimed": true, 00:21:02.223 "claim_type": "exclusive_write", 00:21:02.223 "zoned": false, 00:21:02.223 "supported_io_types": { 00:21:02.223 "read": true, 00:21:02.223 "write": true, 00:21:02.223 "unmap": true, 00:21:02.223 "flush": true, 00:21:02.223 "reset": true, 
00:21:02.223 "nvme_admin": false, 00:21:02.223 "nvme_io": false, 00:21:02.223 "nvme_io_md": false, 00:21:02.223 "write_zeroes": true, 00:21:02.223 "zcopy": true, 00:21:02.223 "get_zone_info": false, 00:21:02.223 "zone_management": false, 00:21:02.223 "zone_append": false, 00:21:02.223 "compare": false, 00:21:02.223 "compare_and_write": false, 00:21:02.223 "abort": true, 00:21:02.223 "seek_hole": false, 00:21:02.223 "seek_data": false, 00:21:02.223 "copy": true, 00:21:02.223 "nvme_iov_md": false 00:21:02.223 }, 00:21:02.223 "memory_domains": [ 00:21:02.223 { 00:21:02.223 "dma_device_id": "system", 00:21:02.223 "dma_device_type": 1 00:21:02.223 }, 00:21:02.223 { 00:21:02.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.223 "dma_device_type": 2 00:21:02.223 } 00:21:02.223 ], 00:21:02.223 "driver_specific": {} 00:21:02.223 }' 00:21:02.223 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.223 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:02.223 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:02.224 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.483 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:02.483 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:02.484 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.484 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:02.484 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:02.484 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.484 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:02.484 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:02.484 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:02.747 [2024-07-12 15:57:23.070891] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.747 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:03.011 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.011 "name": "Existed_Raid", 00:21:03.011 "uuid": "1362d845-732f-470a-9944-451511a8a97f", 00:21:03.011 "strip_size_kb": 0, 00:21:03.011 "state": "online", 00:21:03.011 "raid_level": "raid1", 00:21:03.011 "superblock": false, 00:21:03.011 "num_base_bdevs": 4, 00:21:03.011 "num_base_bdevs_discovered": 3, 00:21:03.011 "num_base_bdevs_operational": 3, 00:21:03.011 "base_bdevs_list": [ 00:21:03.011 { 00:21:03.011 "name": null, 00:21:03.011 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.011 "is_configured": false, 00:21:03.011 "data_offset": 0, 00:21:03.011 "data_size": 65536 00:21:03.011 }, 00:21:03.011 { 00:21:03.011 "name": "BaseBdev2", 00:21:03.011 "uuid": "a462888b-a70a-42d1-b58b-11ac4478f90c", 00:21:03.011 "is_configured": true, 00:21:03.011 "data_offset": 0, 00:21:03.011 "data_size": 65536 00:21:03.011 }, 00:21:03.011 { 00:21:03.011 "name": "BaseBdev3", 00:21:03.011 "uuid": "c707694e-5df0-4f7d-bad7-5dcca759ca66", 00:21:03.011 "is_configured": true, 00:21:03.011 "data_offset": 0, 00:21:03.011 "data_size": 65536 00:21:03.011 }, 00:21:03.011 { 00:21:03.011 "name": "BaseBdev4", 00:21:03.011 "uuid": "b5b0c83b-4d1a-4393-9a4f-16ed9750f8ef", 00:21:03.011 "is_configured": true, 00:21:03.011 "data_offset": 0, 00:21:03.011 "data_size": 65536 00:21:03.011 } 00:21:03.011 ] 00:21:03.011 }' 00:21:03.011 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.011 15:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:03.581 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:03.581 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:03.581 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.581 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:03.581 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:03.581 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:03.581 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:03.840 [2024-07-12 15:57:24.181721] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:03.840 15:57:24 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:03.840 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:03.840 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.840 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:04.100 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:04.100 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:04.100 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:04.360 [2024-07-12 15:57:24.568549] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:04.360 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:04.360 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:04.360 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.360 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:04.360 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:04.360 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:04.360 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:04.621 [2024-07-12 15:57:24.955267] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:04.621 [2024-07-12 15:57:24.955322] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:04.621 [2024-07-12 15:57:24.961306] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:04.621 [2024-07-12 15:57:24.961331] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:04.621 [2024-07-12 15:57:24.961337] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x272f1d0 name Existed_Raid, state offline 00:21:04.621 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:04.621 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:04.622 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.622 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:04.882 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:04.882 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:04.882 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:04.882 15:57:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:04.882 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:04.882 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:05.142 BaseBdev2 00:21:05.142 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:05.142 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:05.142 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:05.142 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:05.142 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:05.142 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:05.142 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:05.142 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:05.402 [ 00:21:05.402 { 00:21:05.402 "name": "BaseBdev2", 00:21:05.402 "aliases": [ 00:21:05.402 "165539fe-18ce-474e-9243-62c48914f425" 00:21:05.402 ], 00:21:05.402 "product_name": "Malloc disk", 00:21:05.402 "block_size": 512, 00:21:05.402 "num_blocks": 65536, 00:21:05.402 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:05.402 "assigned_rate_limits": { 00:21:05.402 "rw_ios_per_sec": 0, 00:21:05.402 "rw_mbytes_per_sec": 0, 00:21:05.402 "r_mbytes_per_sec": 0, 00:21:05.402 "w_mbytes_per_sec": 0 00:21:05.402 }, 00:21:05.402 "claimed": false, 00:21:05.402 "zoned": false, 00:21:05.402 "supported_io_types": { 00:21:05.402 "read": true, 00:21:05.402 "write": true, 00:21:05.402 "unmap": true, 00:21:05.402 "flush": true, 00:21:05.402 "reset": true, 00:21:05.402 "nvme_admin": false, 00:21:05.402 "nvme_io": false, 00:21:05.402 "nvme_io_md": false, 00:21:05.403 "write_zeroes": true, 00:21:05.403 "zcopy": true, 00:21:05.403 "get_zone_info": false, 00:21:05.403 "zone_management": false, 00:21:05.403 "zone_append": false, 00:21:05.403 "compare": false, 00:21:05.403 "compare_and_write": false, 00:21:05.403 "abort": true, 00:21:05.403 "seek_hole": false, 00:21:05.403 "seek_data": false, 00:21:05.403 "copy": true, 00:21:05.403 "nvme_iov_md": false 00:21:05.403 }, 00:21:05.403 "memory_domains": [ 00:21:05.403 { 00:21:05.403 "dma_device_id": "system", 00:21:05.403 "dma_device_type": 1 00:21:05.403 }, 00:21:05.403 { 00:21:05.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.403 "dma_device_type": 2 00:21:05.403 } 00:21:05.403 ], 00:21:05.403 "driver_specific": {} 00:21:05.403 } 00:21:05.403 ] 00:21:05.403 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:05.403 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:05.403 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:05.403 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:05.662 BaseBdev3 00:21:05.662 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:05.662 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:05.662 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:05.662 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:05.662 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:05.662 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:05.662 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:05.923 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:05.923 [ 00:21:05.923 { 00:21:05.923 "name": "BaseBdev3", 00:21:05.923 "aliases": [ 00:21:05.923 "fb74761f-974b-417a-a757-c17011a065a3" 00:21:05.923 ], 00:21:05.923 "product_name": "Malloc disk", 00:21:05.923 "block_size": 512, 00:21:05.923 "num_blocks": 65536, 00:21:05.923 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:05.923 "assigned_rate_limits": { 00:21:05.923 "rw_ios_per_sec": 0, 00:21:05.923 "rw_mbytes_per_sec": 0, 00:21:05.923 "r_mbytes_per_sec": 0, 00:21:05.923 "w_mbytes_per_sec": 0 00:21:05.923 }, 00:21:05.923 "claimed": false, 00:21:05.923 "zoned": false, 00:21:05.923 "supported_io_types": { 00:21:05.923 "read": true, 00:21:05.923 "write": true, 00:21:05.923 "unmap": true, 00:21:05.923 "flush": true, 00:21:05.923 "reset": true, 00:21:05.923 "nvme_admin": false, 00:21:05.923 "nvme_io": false, 00:21:05.923 "nvme_io_md": false, 00:21:05.923 "write_zeroes": true, 00:21:05.923 "zcopy": true, 00:21:05.923 "get_zone_info": false, 00:21:05.923 "zone_management": false, 00:21:05.924 "zone_append": false, 00:21:05.924 "compare": false, 00:21:05.924 "compare_and_write": false, 00:21:05.924 "abort": true, 00:21:05.924 "seek_hole": false, 00:21:05.924 "seek_data": false, 00:21:05.924 "copy": true, 00:21:05.924 "nvme_iov_md": false 00:21:05.924 }, 00:21:05.924 "memory_domains": [ 00:21:05.924 { 00:21:05.924 "dma_device_id": "system", 00:21:05.924 "dma_device_type": 1 00:21:05.924 }, 00:21:05.924 { 00:21:05.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.924 "dma_device_type": 2 00:21:05.924 } 00:21:05.924 ], 00:21:05.924 "driver_specific": {} 00:21:05.924 } 00:21:05.924 ] 00:21:05.924 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:05.924 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:05.924 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:05.924 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:06.184 BaseBdev4 00:21:06.184 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:06.184 15:57:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:06.184 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:06.184 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:06.184 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:06.184 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:06.184 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:06.444 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:06.444 [ 00:21:06.444 { 00:21:06.444 "name": "BaseBdev4", 00:21:06.444 "aliases": [ 00:21:06.444 "80c6337f-7b65-4550-9169-42300d50339d" 00:21:06.444 ], 00:21:06.444 "product_name": "Malloc disk", 00:21:06.444 "block_size": 512, 00:21:06.444 "num_blocks": 65536, 00:21:06.444 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:06.444 "assigned_rate_limits": { 00:21:06.444 "rw_ios_per_sec": 0, 00:21:06.444 "rw_mbytes_per_sec": 0, 00:21:06.444 "r_mbytes_per_sec": 0, 00:21:06.444 "w_mbytes_per_sec": 0 00:21:06.444 }, 00:21:06.444 "claimed": false, 00:21:06.444 "zoned": false, 00:21:06.444 "supported_io_types": { 00:21:06.444 "read": true, 00:21:06.444 "write": true, 00:21:06.444 "unmap": true, 00:21:06.444 "flush": true, 00:21:06.444 "reset": true, 00:21:06.444 "nvme_admin": false, 00:21:06.444 "nvme_io": false, 00:21:06.444 "nvme_io_md": false, 00:21:06.444 "write_zeroes": true, 00:21:06.444 "zcopy": true, 00:21:06.444 "get_zone_info": false, 00:21:06.444 "zone_management": false, 00:21:06.444 "zone_append": false, 00:21:06.444 "compare": false, 00:21:06.444 "compare_and_write": false, 00:21:06.444 "abort": true, 00:21:06.444 "seek_hole": false, 00:21:06.444 "seek_data": false, 00:21:06.444 "copy": true, 00:21:06.444 "nvme_iov_md": false 00:21:06.444 }, 00:21:06.444 "memory_domains": [ 00:21:06.444 { 00:21:06.444 "dma_device_id": "system", 00:21:06.444 "dma_device_type": 1 00:21:06.444 }, 00:21:06.444 { 00:21:06.444 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.444 "dma_device_type": 2 00:21:06.444 } 00:21:06.444 ], 00:21:06.444 "driver_specific": {} 00:21:06.444 } 00:21:06.444 ] 00:21:06.444 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:06.444 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:06.444 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:06.444 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:06.703 [2024-07-12 15:57:27.062369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:06.703 [2024-07-12 15:57:27.062396] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:06.703 [2024-07-12 15:57:27.062408] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
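For reference, the construction sequence traced above can be replayed by hand against a running SPDK application. The RPC names, arguments, socket path and jq filter below are taken verbatim from the xtrace lines in this log; the short scripts/rpc.py path (instead of the full /var/jenkins/... prefix) is an assumption about the caller's working directory. Because BaseBdev1 has not been created yet, the raid bdev is expected to stay in the "configuring" state rather than going online.

  # create three of the four base bdevs (32 MiB each, 512-byte blocks, matching num_blocks 65536 in the dumps above)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
  # request a raid1 bdev that also names the still-missing BaseBdev1
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
  # the raid bdev exists but cannot be brought online until BaseBdev1 appears; prints "configuring"
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid") | .state'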
00:21:06.703 [2024-07-12 15:57:27.063428] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:06.703 [2024-07-12 15:57:27.063458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.703 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:06.963 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:06.963 "name": "Existed_Raid", 00:21:06.963 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.963 "strip_size_kb": 0, 00:21:06.964 "state": "configuring", 00:21:06.964 "raid_level": "raid1", 00:21:06.964 "superblock": false, 00:21:06.964 "num_base_bdevs": 4, 00:21:06.964 "num_base_bdevs_discovered": 3, 00:21:06.964 "num_base_bdevs_operational": 4, 00:21:06.964 "base_bdevs_list": [ 00:21:06.964 { 00:21:06.964 "name": "BaseBdev1", 00:21:06.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.964 "is_configured": false, 00:21:06.964 "data_offset": 0, 00:21:06.964 "data_size": 0 00:21:06.964 }, 00:21:06.964 { 00:21:06.964 "name": "BaseBdev2", 00:21:06.964 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:06.964 "is_configured": true, 00:21:06.964 "data_offset": 0, 00:21:06.964 "data_size": 65536 00:21:06.964 }, 00:21:06.964 { 00:21:06.964 "name": "BaseBdev3", 00:21:06.964 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:06.964 "is_configured": true, 00:21:06.964 "data_offset": 0, 00:21:06.964 "data_size": 65536 00:21:06.964 }, 00:21:06.964 { 00:21:06.964 "name": "BaseBdev4", 00:21:06.964 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:06.964 "is_configured": true, 00:21:06.964 "data_offset": 0, 00:21:06.964 "data_size": 65536 00:21:06.964 } 00:21:06.964 ] 00:21:06.964 }' 00:21:06.964 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:06.964 15:57:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev2 00:21:07.533 [2024-07-12 15:57:27.932535] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:07.533 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.793 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:07.793 "name": "Existed_Raid", 00:21:07.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.793 "strip_size_kb": 0, 00:21:07.793 "state": "configuring", 00:21:07.793 "raid_level": "raid1", 00:21:07.793 "superblock": false, 00:21:07.793 "num_base_bdevs": 4, 00:21:07.793 "num_base_bdevs_discovered": 2, 00:21:07.793 "num_base_bdevs_operational": 4, 00:21:07.793 "base_bdevs_list": [ 00:21:07.793 { 00:21:07.793 "name": "BaseBdev1", 00:21:07.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:07.793 "is_configured": false, 00:21:07.793 "data_offset": 0, 00:21:07.793 "data_size": 0 00:21:07.793 }, 00:21:07.793 { 00:21:07.793 "name": null, 00:21:07.793 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:07.793 "is_configured": false, 00:21:07.793 "data_offset": 0, 00:21:07.793 "data_size": 65536 00:21:07.793 }, 00:21:07.793 { 00:21:07.793 "name": "BaseBdev3", 00:21:07.793 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:07.793 "is_configured": true, 00:21:07.793 "data_offset": 0, 00:21:07.793 "data_size": 65536 00:21:07.793 }, 00:21:07.793 { 00:21:07.793 "name": "BaseBdev4", 00:21:07.793 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:07.793 "is_configured": true, 00:21:07.793 "data_offset": 0, 00:21:07.793 "data_size": 65536 00:21:07.793 } 00:21:07.793 ] 00:21:07.793 }' 00:21:07.793 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:07.793 15:57:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:08.363 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.363 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:21:08.622 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:08.622 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:08.622 [2024-07-12 15:57:29.052321] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:08.622 BaseBdev1 00:21:08.622 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:08.622 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:08.622 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:08.622 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:08.622 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:08.622 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:08.622 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:08.882 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:09.141 [ 00:21:09.141 { 00:21:09.141 "name": "BaseBdev1", 00:21:09.141 "aliases": [ 00:21:09.141 "4e88e1c1-9d16-4a49-a13f-da1832de47cd" 00:21:09.141 ], 00:21:09.141 "product_name": "Malloc disk", 00:21:09.141 "block_size": 512, 00:21:09.141 "num_blocks": 65536, 00:21:09.141 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:09.141 "assigned_rate_limits": { 00:21:09.141 "rw_ios_per_sec": 0, 00:21:09.141 "rw_mbytes_per_sec": 0, 00:21:09.141 "r_mbytes_per_sec": 0, 00:21:09.141 "w_mbytes_per_sec": 0 00:21:09.141 }, 00:21:09.141 "claimed": true, 00:21:09.141 "claim_type": "exclusive_write", 00:21:09.141 "zoned": false, 00:21:09.141 "supported_io_types": { 00:21:09.141 "read": true, 00:21:09.141 "write": true, 00:21:09.141 "unmap": true, 00:21:09.142 "flush": true, 00:21:09.142 "reset": true, 00:21:09.142 "nvme_admin": false, 00:21:09.142 "nvme_io": false, 00:21:09.142 "nvme_io_md": false, 00:21:09.142 "write_zeroes": true, 00:21:09.142 "zcopy": true, 00:21:09.142 "get_zone_info": false, 00:21:09.142 "zone_management": false, 00:21:09.142 "zone_append": false, 00:21:09.142 "compare": false, 00:21:09.142 "compare_and_write": false, 00:21:09.142 "abort": true, 00:21:09.142 "seek_hole": false, 00:21:09.142 "seek_data": false, 00:21:09.142 "copy": true, 00:21:09.142 "nvme_iov_md": false 00:21:09.142 }, 00:21:09.142 "memory_domains": [ 00:21:09.142 { 00:21:09.142 "dma_device_id": "system", 00:21:09.142 "dma_device_type": 1 00:21:09.142 }, 00:21:09.142 { 00:21:09.142 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:09.142 "dma_device_type": 2 00:21:09.142 } 00:21:09.142 ], 00:21:09.142 "driver_specific": {} 00:21:09.142 } 00:21:09.142 ] 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:09.142 15:57:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.142 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:09.401 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:09.401 "name": "Existed_Raid", 00:21:09.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.401 "strip_size_kb": 0, 00:21:09.401 "state": "configuring", 00:21:09.401 "raid_level": "raid1", 00:21:09.401 "superblock": false, 00:21:09.401 "num_base_bdevs": 4, 00:21:09.401 "num_base_bdevs_discovered": 3, 00:21:09.401 "num_base_bdevs_operational": 4, 00:21:09.401 "base_bdevs_list": [ 00:21:09.401 { 00:21:09.401 "name": "BaseBdev1", 00:21:09.401 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:09.401 "is_configured": true, 00:21:09.401 "data_offset": 0, 00:21:09.401 "data_size": 65536 00:21:09.401 }, 00:21:09.401 { 00:21:09.401 "name": null, 00:21:09.401 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:09.401 "is_configured": false, 00:21:09.401 "data_offset": 0, 00:21:09.401 "data_size": 65536 00:21:09.401 }, 00:21:09.401 { 00:21:09.401 "name": "BaseBdev3", 00:21:09.401 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:09.401 "is_configured": true, 00:21:09.401 "data_offset": 0, 00:21:09.401 "data_size": 65536 00:21:09.401 }, 00:21:09.401 { 00:21:09.401 "name": "BaseBdev4", 00:21:09.401 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:09.401 "is_configured": true, 00:21:09.401 "data_offset": 0, 00:21:09.401 "data_size": 65536 00:21:09.401 } 00:21:09.401 ] 00:21:09.401 }' 00:21:09.401 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:09.401 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:09.971 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.971 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:09.971 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:09.971 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:10.232 [2024-07-12 15:57:30.556149] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.232 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.516 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.516 "name": "Existed_Raid", 00:21:10.516 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.516 "strip_size_kb": 0, 00:21:10.516 "state": "configuring", 00:21:10.516 "raid_level": "raid1", 00:21:10.516 "superblock": false, 00:21:10.516 "num_base_bdevs": 4, 00:21:10.516 "num_base_bdevs_discovered": 2, 00:21:10.516 "num_base_bdevs_operational": 4, 00:21:10.516 "base_bdevs_list": [ 00:21:10.516 { 00:21:10.516 "name": "BaseBdev1", 00:21:10.516 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:10.516 "is_configured": true, 00:21:10.516 "data_offset": 0, 00:21:10.516 "data_size": 65536 00:21:10.516 }, 00:21:10.516 { 00:21:10.516 "name": null, 00:21:10.516 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:10.516 "is_configured": false, 00:21:10.516 "data_offset": 0, 00:21:10.516 "data_size": 65536 00:21:10.516 }, 00:21:10.516 { 00:21:10.516 "name": null, 00:21:10.516 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:10.516 "is_configured": false, 00:21:10.516 "data_offset": 0, 00:21:10.516 "data_size": 65536 00:21:10.516 }, 00:21:10.516 { 00:21:10.516 "name": "BaseBdev4", 00:21:10.516 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:10.516 "is_configured": true, 00:21:10.516 "data_offset": 0, 00:21:10.516 "data_size": 65536 00:21:10.516 } 00:21:10.516 ] 00:21:10.516 }' 00:21:10.516 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.516 15:57:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:11.087 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.087 15:57:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:11.087 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:11.087 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:11.346 [2024-07-12 15:57:31.687019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.346 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:11.607 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:11.607 "name": "Existed_Raid", 00:21:11.607 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:11.607 "strip_size_kb": 0, 00:21:11.607 "state": "configuring", 00:21:11.607 "raid_level": "raid1", 00:21:11.607 "superblock": false, 00:21:11.607 "num_base_bdevs": 4, 00:21:11.607 "num_base_bdevs_discovered": 3, 00:21:11.607 "num_base_bdevs_operational": 4, 00:21:11.607 "base_bdevs_list": [ 00:21:11.607 { 00:21:11.607 "name": "BaseBdev1", 00:21:11.607 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:11.607 "is_configured": true, 00:21:11.607 "data_offset": 0, 00:21:11.607 "data_size": 65536 00:21:11.607 }, 00:21:11.607 { 00:21:11.607 "name": null, 00:21:11.607 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:11.607 "is_configured": false, 00:21:11.607 "data_offset": 0, 00:21:11.607 "data_size": 65536 00:21:11.607 }, 00:21:11.607 { 00:21:11.607 "name": "BaseBdev3", 00:21:11.607 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:11.607 "is_configured": true, 00:21:11.607 "data_offset": 0, 00:21:11.607 "data_size": 65536 00:21:11.607 }, 00:21:11.607 { 00:21:11.607 "name": "BaseBdev4", 00:21:11.607 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:11.607 "is_configured": true, 00:21:11.607 "data_offset": 0, 00:21:11.607 "data_size": 65536 00:21:11.607 } 00:21:11.607 ] 00:21:11.607 }' 00:21:11.607 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
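The remove/re-add cycle the trace just completed for BaseBdev3 amounts to the following hand-run sketch (commands and jq filters copied from the log above; the shortened scripts/rpc.py path is again an assumption). Index 2 in base_bdevs_list is BaseBdev3's slot, so is_configured is expected to flip from false back to true once the base bdev is re-added and claimed.

  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # false: the slot is kept, the bdev is gone
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq '.[0].base_bdevs_list[2].is_configured'   # true: BaseBdev3 is claimed again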
00:21:11.607 15:57:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:12.178 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.178 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:12.438 [2024-07-12 15:57:32.825901] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.438 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.439 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.439 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.439 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:12.699 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.699 "name": "Existed_Raid", 00:21:12.699 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.699 "strip_size_kb": 0, 00:21:12.699 "state": "configuring", 00:21:12.699 "raid_level": "raid1", 00:21:12.699 "superblock": false, 00:21:12.699 "num_base_bdevs": 4, 00:21:12.699 "num_base_bdevs_discovered": 2, 00:21:12.699 "num_base_bdevs_operational": 4, 00:21:12.699 "base_bdevs_list": [ 00:21:12.699 { 00:21:12.699 "name": null, 00:21:12.699 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:12.699 "is_configured": false, 00:21:12.699 "data_offset": 0, 00:21:12.699 "data_size": 65536 00:21:12.699 }, 00:21:12.699 { 00:21:12.699 "name": null, 00:21:12.699 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:12.699 "is_configured": false, 00:21:12.699 "data_offset": 0, 00:21:12.699 "data_size": 65536 00:21:12.699 }, 00:21:12.699 { 00:21:12.699 "name": "BaseBdev3", 00:21:12.699 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:12.699 "is_configured": true, 00:21:12.699 "data_offset": 0, 00:21:12.699 "data_size": 65536 00:21:12.699 }, 00:21:12.699 { 00:21:12.699 "name": 
"BaseBdev4", 00:21:12.699 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:12.699 "is_configured": true, 00:21:12.699 "data_offset": 0, 00:21:12.699 "data_size": 65536 00:21:12.699 } 00:21:12.699 ] 00:21:12.699 }' 00:21:12.699 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.699 15:57:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:13.268 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.268 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:13.527 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:13.527 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:13.787 [2024-07-12 15:57:33.974434] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.787 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:13.787 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:13.787 "name": "Existed_Raid", 00:21:13.787 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:13.787 "strip_size_kb": 0, 00:21:13.787 "state": "configuring", 00:21:13.787 "raid_level": "raid1", 00:21:13.787 "superblock": false, 00:21:13.787 "num_base_bdevs": 4, 00:21:13.787 "num_base_bdevs_discovered": 3, 00:21:13.787 "num_base_bdevs_operational": 4, 00:21:13.787 "base_bdevs_list": [ 00:21:13.787 { 00:21:13.787 "name": null, 00:21:13.787 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:13.787 "is_configured": false, 00:21:13.787 "data_offset": 0, 00:21:13.787 "data_size": 65536 00:21:13.787 }, 00:21:13.787 { 00:21:13.787 "name": "BaseBdev2", 00:21:13.787 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:13.787 
"is_configured": true, 00:21:13.787 "data_offset": 0, 00:21:13.787 "data_size": 65536 00:21:13.787 }, 00:21:13.787 { 00:21:13.787 "name": "BaseBdev3", 00:21:13.787 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:13.787 "is_configured": true, 00:21:13.787 "data_offset": 0, 00:21:13.787 "data_size": 65536 00:21:13.787 }, 00:21:13.787 { 00:21:13.787 "name": "BaseBdev4", 00:21:13.787 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:13.787 "is_configured": true, 00:21:13.787 "data_offset": 0, 00:21:13.787 "data_size": 65536 00:21:13.787 } 00:21:13.787 ] 00:21:13.787 }' 00:21:13.787 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:13.787 15:57:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:14.357 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.357 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:14.616 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:14.616 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.616 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:14.876 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4e88e1c1-9d16-4a49-a13f-da1832de47cd 00:21:14.876 [2024-07-12 15:57:35.278491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:14.876 [2024-07-12 15:57:35.278514] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27327e0 00:21:14.876 [2024-07-12 15:57:35.278518] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:14.876 [2024-07-12 15:57:35.278668] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x272e8f0 00:21:14.876 [2024-07-12 15:57:35.278777] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27327e0 00:21:14.876 [2024-07-12 15:57:35.278783] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x27327e0 00:21:14.876 [2024-07-12 15:57:35.278907] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:14.876 NewBaseBdev 00:21:14.876 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:14.876 15:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:14.876 15:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:14.876 15:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:14.876 15:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:14.876 15:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:14.876 15:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:15.136 15:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:15.396 [ 00:21:15.396 { 00:21:15.396 "name": "NewBaseBdev", 00:21:15.396 "aliases": [ 00:21:15.396 "4e88e1c1-9d16-4a49-a13f-da1832de47cd" 00:21:15.396 ], 00:21:15.396 "product_name": "Malloc disk", 00:21:15.396 "block_size": 512, 00:21:15.396 "num_blocks": 65536, 00:21:15.396 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:15.396 "assigned_rate_limits": { 00:21:15.396 "rw_ios_per_sec": 0, 00:21:15.396 "rw_mbytes_per_sec": 0, 00:21:15.396 "r_mbytes_per_sec": 0, 00:21:15.396 "w_mbytes_per_sec": 0 00:21:15.396 }, 00:21:15.396 "claimed": true, 00:21:15.396 "claim_type": "exclusive_write", 00:21:15.396 "zoned": false, 00:21:15.396 "supported_io_types": { 00:21:15.396 "read": true, 00:21:15.396 "write": true, 00:21:15.396 "unmap": true, 00:21:15.396 "flush": true, 00:21:15.396 "reset": true, 00:21:15.396 "nvme_admin": false, 00:21:15.396 "nvme_io": false, 00:21:15.396 "nvme_io_md": false, 00:21:15.396 "write_zeroes": true, 00:21:15.396 "zcopy": true, 00:21:15.396 "get_zone_info": false, 00:21:15.396 "zone_management": false, 00:21:15.396 "zone_append": false, 00:21:15.396 "compare": false, 00:21:15.396 "compare_and_write": false, 00:21:15.396 "abort": true, 00:21:15.396 "seek_hole": false, 00:21:15.397 "seek_data": false, 00:21:15.397 "copy": true, 00:21:15.397 "nvme_iov_md": false 00:21:15.397 }, 00:21:15.397 "memory_domains": [ 00:21:15.397 { 00:21:15.397 "dma_device_id": "system", 00:21:15.397 "dma_device_type": 1 00:21:15.397 }, 00:21:15.397 { 00:21:15.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:15.397 "dma_device_type": 2 00:21:15.397 } 00:21:15.397 ], 00:21:15.397 "driver_specific": {} 00:21:15.397 } 00:21:15.397 ] 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.397 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:15.657 15:57:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.657 "name": "Existed_Raid", 00:21:15.657 "uuid": "cd57c555-a344-4648-8e26-ee2ec5804cfb", 00:21:15.657 "strip_size_kb": 0, 00:21:15.657 "state": "online", 00:21:15.657 "raid_level": "raid1", 00:21:15.657 "superblock": false, 00:21:15.657 "num_base_bdevs": 4, 00:21:15.657 "num_base_bdevs_discovered": 4, 00:21:15.657 "num_base_bdevs_operational": 4, 00:21:15.657 "base_bdevs_list": [ 00:21:15.657 { 00:21:15.657 "name": "NewBaseBdev", 00:21:15.657 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:15.657 "is_configured": true, 00:21:15.657 "data_offset": 0, 00:21:15.657 "data_size": 65536 00:21:15.657 }, 00:21:15.657 { 00:21:15.657 "name": "BaseBdev2", 00:21:15.657 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:15.657 "is_configured": true, 00:21:15.657 "data_offset": 0, 00:21:15.657 "data_size": 65536 00:21:15.657 }, 00:21:15.657 { 00:21:15.657 "name": "BaseBdev3", 00:21:15.657 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:15.657 "is_configured": true, 00:21:15.657 "data_offset": 0, 00:21:15.657 "data_size": 65536 00:21:15.657 }, 00:21:15.657 { 00:21:15.657 "name": "BaseBdev4", 00:21:15.657 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:15.657 "is_configured": true, 00:21:15.657 "data_offset": 0, 00:21:15.657 "data_size": 65536 00:21:15.657 } 00:21:15.657 ] 00:21:15.657 }' 00:21:15.657 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.657 15:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:16.228 [2024-07-12 15:57:36.574027] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:16.228 "name": "Existed_Raid", 00:21:16.228 "aliases": [ 00:21:16.228 "cd57c555-a344-4648-8e26-ee2ec5804cfb" 00:21:16.228 ], 00:21:16.228 "product_name": "Raid Volume", 00:21:16.228 "block_size": 512, 00:21:16.228 "num_blocks": 65536, 00:21:16.228 "uuid": "cd57c555-a344-4648-8e26-ee2ec5804cfb", 00:21:16.228 "assigned_rate_limits": { 00:21:16.228 "rw_ios_per_sec": 0, 00:21:16.228 "rw_mbytes_per_sec": 0, 00:21:16.228 "r_mbytes_per_sec": 0, 00:21:16.228 "w_mbytes_per_sec": 0 00:21:16.228 }, 00:21:16.228 "claimed": false, 00:21:16.228 "zoned": false, 00:21:16.228 "supported_io_types": { 00:21:16.228 "read": true, 00:21:16.228 "write": true, 00:21:16.228 "unmap": false, 00:21:16.228 "flush": false, 00:21:16.228 "reset": true, 
00:21:16.228 "nvme_admin": false, 00:21:16.228 "nvme_io": false, 00:21:16.228 "nvme_io_md": false, 00:21:16.228 "write_zeroes": true, 00:21:16.228 "zcopy": false, 00:21:16.228 "get_zone_info": false, 00:21:16.228 "zone_management": false, 00:21:16.228 "zone_append": false, 00:21:16.228 "compare": false, 00:21:16.228 "compare_and_write": false, 00:21:16.228 "abort": false, 00:21:16.228 "seek_hole": false, 00:21:16.228 "seek_data": false, 00:21:16.228 "copy": false, 00:21:16.228 "nvme_iov_md": false 00:21:16.228 }, 00:21:16.228 "memory_domains": [ 00:21:16.228 { 00:21:16.228 "dma_device_id": "system", 00:21:16.228 "dma_device_type": 1 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:16.228 "dma_device_type": 2 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "dma_device_id": "system", 00:21:16.228 "dma_device_type": 1 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:16.228 "dma_device_type": 2 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "dma_device_id": "system", 00:21:16.228 "dma_device_type": 1 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:16.228 "dma_device_type": 2 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "dma_device_id": "system", 00:21:16.228 "dma_device_type": 1 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:16.228 "dma_device_type": 2 00:21:16.228 } 00:21:16.228 ], 00:21:16.228 "driver_specific": { 00:21:16.228 "raid": { 00:21:16.228 "uuid": "cd57c555-a344-4648-8e26-ee2ec5804cfb", 00:21:16.228 "strip_size_kb": 0, 00:21:16.228 "state": "online", 00:21:16.228 "raid_level": "raid1", 00:21:16.228 "superblock": false, 00:21:16.228 "num_base_bdevs": 4, 00:21:16.228 "num_base_bdevs_discovered": 4, 00:21:16.228 "num_base_bdevs_operational": 4, 00:21:16.228 "base_bdevs_list": [ 00:21:16.228 { 00:21:16.228 "name": "NewBaseBdev", 00:21:16.228 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:16.228 "is_configured": true, 00:21:16.228 "data_offset": 0, 00:21:16.228 "data_size": 65536 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "name": "BaseBdev2", 00:21:16.228 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:16.228 "is_configured": true, 00:21:16.228 "data_offset": 0, 00:21:16.228 "data_size": 65536 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "name": "BaseBdev3", 00:21:16.228 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:16.228 "is_configured": true, 00:21:16.228 "data_offset": 0, 00:21:16.228 "data_size": 65536 00:21:16.228 }, 00:21:16.228 { 00:21:16.228 "name": "BaseBdev4", 00:21:16.228 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:16.228 "is_configured": true, 00:21:16.228 "data_offset": 0, 00:21:16.228 "data_size": 65536 00:21:16.228 } 00:21:16.228 ] 00:21:16.228 } 00:21:16.228 } 00:21:16.228 }' 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:16.228 BaseBdev2 00:21:16.228 BaseBdev3 00:21:16.228 BaseBdev4' 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:16.228 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:16.228 15:57:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:16.487 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:16.487 "name": "NewBaseBdev", 00:21:16.487 "aliases": [ 00:21:16.487 "4e88e1c1-9d16-4a49-a13f-da1832de47cd" 00:21:16.487 ], 00:21:16.487 "product_name": "Malloc disk", 00:21:16.487 "block_size": 512, 00:21:16.487 "num_blocks": 65536, 00:21:16.487 "uuid": "4e88e1c1-9d16-4a49-a13f-da1832de47cd", 00:21:16.487 "assigned_rate_limits": { 00:21:16.487 "rw_ios_per_sec": 0, 00:21:16.487 "rw_mbytes_per_sec": 0, 00:21:16.487 "r_mbytes_per_sec": 0, 00:21:16.487 "w_mbytes_per_sec": 0 00:21:16.487 }, 00:21:16.487 "claimed": true, 00:21:16.487 "claim_type": "exclusive_write", 00:21:16.487 "zoned": false, 00:21:16.487 "supported_io_types": { 00:21:16.487 "read": true, 00:21:16.487 "write": true, 00:21:16.487 "unmap": true, 00:21:16.487 "flush": true, 00:21:16.487 "reset": true, 00:21:16.487 "nvme_admin": false, 00:21:16.487 "nvme_io": false, 00:21:16.487 "nvme_io_md": false, 00:21:16.487 "write_zeroes": true, 00:21:16.487 "zcopy": true, 00:21:16.487 "get_zone_info": false, 00:21:16.487 "zone_management": false, 00:21:16.487 "zone_append": false, 00:21:16.487 "compare": false, 00:21:16.487 "compare_and_write": false, 00:21:16.487 "abort": true, 00:21:16.487 "seek_hole": false, 00:21:16.487 "seek_data": false, 00:21:16.487 "copy": true, 00:21:16.487 "nvme_iov_md": false 00:21:16.487 }, 00:21:16.487 "memory_domains": [ 00:21:16.487 { 00:21:16.487 "dma_device_id": "system", 00:21:16.487 "dma_device_type": 1 00:21:16.487 }, 00:21:16.487 { 00:21:16.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:16.487 "dma_device_type": 2 00:21:16.487 } 00:21:16.487 ], 00:21:16.487 "driver_specific": {} 00:21:16.487 }' 00:21:16.487 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:16.487 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:16.487 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:16.487 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.747 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:16.747 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:17.006 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:21:17.006 "name": "BaseBdev2", 00:21:17.006 "aliases": [ 00:21:17.006 "165539fe-18ce-474e-9243-62c48914f425" 00:21:17.006 ], 00:21:17.006 "product_name": "Malloc disk", 00:21:17.006 "block_size": 512, 00:21:17.006 "num_blocks": 65536, 00:21:17.006 "uuid": "165539fe-18ce-474e-9243-62c48914f425", 00:21:17.006 "assigned_rate_limits": { 00:21:17.006 "rw_ios_per_sec": 0, 00:21:17.006 "rw_mbytes_per_sec": 0, 00:21:17.006 "r_mbytes_per_sec": 0, 00:21:17.006 "w_mbytes_per_sec": 0 00:21:17.006 }, 00:21:17.006 "claimed": true, 00:21:17.006 "claim_type": "exclusive_write", 00:21:17.006 "zoned": false, 00:21:17.006 "supported_io_types": { 00:21:17.006 "read": true, 00:21:17.006 "write": true, 00:21:17.006 "unmap": true, 00:21:17.006 "flush": true, 00:21:17.006 "reset": true, 00:21:17.006 "nvme_admin": false, 00:21:17.006 "nvme_io": false, 00:21:17.006 "nvme_io_md": false, 00:21:17.006 "write_zeroes": true, 00:21:17.006 "zcopy": true, 00:21:17.006 "get_zone_info": false, 00:21:17.006 "zone_management": false, 00:21:17.006 "zone_append": false, 00:21:17.006 "compare": false, 00:21:17.006 "compare_and_write": false, 00:21:17.006 "abort": true, 00:21:17.006 "seek_hole": false, 00:21:17.006 "seek_data": false, 00:21:17.006 "copy": true, 00:21:17.006 "nvme_iov_md": false 00:21:17.006 }, 00:21:17.006 "memory_domains": [ 00:21:17.006 { 00:21:17.006 "dma_device_id": "system", 00:21:17.006 "dma_device_type": 1 00:21:17.007 }, 00:21:17.007 { 00:21:17.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:17.007 "dma_device_type": 2 00:21:17.007 } 00:21:17.007 ], 00:21:17.007 "driver_specific": {} 00:21:17.007 }' 00:21:17.007 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:17.007 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:17.266 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:17.266 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:17.266 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:17.266 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:17.266 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:17.266 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:17.266 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:17.266 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:17.525 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:17.525 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:17.525 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:17.525 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:17.525 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:18.133 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:18.133 "name": "BaseBdev3", 00:21:18.133 "aliases": [ 00:21:18.133 "fb74761f-974b-417a-a757-c17011a065a3" 00:21:18.133 ], 00:21:18.133 "product_name": "Malloc disk", 
00:21:18.133 "block_size": 512, 00:21:18.133 "num_blocks": 65536, 00:21:18.133 "uuid": "fb74761f-974b-417a-a757-c17011a065a3", 00:21:18.133 "assigned_rate_limits": { 00:21:18.133 "rw_ios_per_sec": 0, 00:21:18.133 "rw_mbytes_per_sec": 0, 00:21:18.133 "r_mbytes_per_sec": 0, 00:21:18.133 "w_mbytes_per_sec": 0 00:21:18.133 }, 00:21:18.133 "claimed": true, 00:21:18.133 "claim_type": "exclusive_write", 00:21:18.133 "zoned": false, 00:21:18.133 "supported_io_types": { 00:21:18.133 "read": true, 00:21:18.133 "write": true, 00:21:18.133 "unmap": true, 00:21:18.133 "flush": true, 00:21:18.133 "reset": true, 00:21:18.133 "nvme_admin": false, 00:21:18.133 "nvme_io": false, 00:21:18.133 "nvme_io_md": false, 00:21:18.133 "write_zeroes": true, 00:21:18.133 "zcopy": true, 00:21:18.133 "get_zone_info": false, 00:21:18.133 "zone_management": false, 00:21:18.133 "zone_append": false, 00:21:18.133 "compare": false, 00:21:18.133 "compare_and_write": false, 00:21:18.133 "abort": true, 00:21:18.133 "seek_hole": false, 00:21:18.133 "seek_data": false, 00:21:18.133 "copy": true, 00:21:18.133 "nvme_iov_md": false 00:21:18.133 }, 00:21:18.133 "memory_domains": [ 00:21:18.133 { 00:21:18.133 "dma_device_id": "system", 00:21:18.133 "dma_device_type": 1 00:21:18.133 }, 00:21:18.133 { 00:21:18.133 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.133 "dma_device_type": 2 00:21:18.133 } 00:21:18.133 ], 00:21:18.133 "driver_specific": {} 00:21:18.133 }' 00:21:18.133 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.133 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.133 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:18.133 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.133 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.392 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:18.392 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:18.392 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:18.392 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:18.392 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:18.392 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:18.651 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:18.651 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:18.651 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:18.651 15:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:18.651 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:18.651 "name": "BaseBdev4", 00:21:18.651 "aliases": [ 00:21:18.651 "80c6337f-7b65-4550-9169-42300d50339d" 00:21:18.651 ], 00:21:18.651 "product_name": "Malloc disk", 00:21:18.651 "block_size": 512, 00:21:18.651 "num_blocks": 65536, 00:21:18.651 "uuid": "80c6337f-7b65-4550-9169-42300d50339d", 00:21:18.651 "assigned_rate_limits": { 00:21:18.651 
"rw_ios_per_sec": 0, 00:21:18.651 "rw_mbytes_per_sec": 0, 00:21:18.651 "r_mbytes_per_sec": 0, 00:21:18.651 "w_mbytes_per_sec": 0 00:21:18.651 }, 00:21:18.651 "claimed": true, 00:21:18.651 "claim_type": "exclusive_write", 00:21:18.651 "zoned": false, 00:21:18.651 "supported_io_types": { 00:21:18.651 "read": true, 00:21:18.651 "write": true, 00:21:18.651 "unmap": true, 00:21:18.651 "flush": true, 00:21:18.651 "reset": true, 00:21:18.651 "nvme_admin": false, 00:21:18.651 "nvme_io": false, 00:21:18.651 "nvme_io_md": false, 00:21:18.651 "write_zeroes": true, 00:21:18.651 "zcopy": true, 00:21:18.651 "get_zone_info": false, 00:21:18.651 "zone_management": false, 00:21:18.651 "zone_append": false, 00:21:18.651 "compare": false, 00:21:18.651 "compare_and_write": false, 00:21:18.651 "abort": true, 00:21:18.651 "seek_hole": false, 00:21:18.651 "seek_data": false, 00:21:18.651 "copy": true, 00:21:18.651 "nvme_iov_md": false 00:21:18.651 }, 00:21:18.651 "memory_domains": [ 00:21:18.651 { 00:21:18.651 "dma_device_id": "system", 00:21:18.651 "dma_device_type": 1 00:21:18.651 }, 00:21:18.651 { 00:21:18.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:18.651 "dma_device_type": 2 00:21:18.651 } 00:21:18.651 ], 00:21:18.651 "driver_specific": {} 00:21:18.651 }' 00:21:18.651 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.910 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:18.910 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:18.910 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.910 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:18.910 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:18.910 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.170 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:19.170 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:19.170 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.170 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:19.170 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:19.170 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:19.429 [2024-07-12 15:57:39.725791] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:19.429 [2024-07-12 15:57:39.725807] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:19.429 [2024-07-12 15:57:39.725844] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:19.429 [2024-07-12 15:57:39.726055] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:19.429 [2024-07-12 15:57:39.726062] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27327e0 name Existed_Raid, state offline 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2601158 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@948 -- # '[' -z 2601158 ']' 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 2601158 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2601158 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2601158' 00:21:19.429 killing process with pid 2601158 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 2601158 00:21:19.429 [2024-07-12 15:57:39.809420] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:19.429 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 2601158 00:21:19.429 [2024-07-12 15:57:39.829751] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:19.697 15:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:19.697 00:21:19.697 real 0m27.953s 00:21:19.697 user 0m52.476s 00:21:19.697 sys 0m4.094s 00:21:19.697 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:19.697 15:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:19.697 ************************************ 00:21:19.697 END TEST raid_state_function_test 00:21:19.697 ************************************ 00:21:19.697 15:57:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:19.697 15:57:39 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:21:19.697 15:57:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:19.697 15:57:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:19.697 15:57:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:19.697 ************************************ 00:21:19.697 START TEST raid_state_function_test_sb 00:21:19.697 ************************************ 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:19.697 15:57:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2606430 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2606430' 00:21:19.697 Process raid pid: 2606430 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2606430 /var/tmp/spdk-raid.sock 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2606430 ']' 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:19.697 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
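For orientation, the raid1 state-function flow traced below can be approximated by hand against a standalone bdev_svc instance using only the RPCs that appear in this log. The sketch that follows is illustrative rather than the test's own code: it reuses the workspace path and RPC socket shown above, replaces the test's waitforlisten helper with a simple retry loop, and creates the four malloc base bdevs up front, whereas the traced test deliberately creates the raid first so it can observe the "configuring" state.

#!/bin/bash
# Hand-run approximation of the traced flow; paths and RPC names are taken from
# the log above, but this script itself is not part of the SPDK test suite.
set -e
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
SOCK=/var/tmp/spdk-raid.sock

# Start the bdev service with the same arguments the test uses.
"$SPDK/test/app/bdev_svc/bdev_svc" -r "$SOCK" -i 0 -L bdev_raid &
svc_pid=$!

# Crude stand-in for the test's waitforlisten helper: poll until the RPC socket answers.
until "$SPDK/scripts/rpc.py" -s "$SOCK" bdev_get_bdevs >/dev/null 2>&1; do
    sleep 0.2
done

# Four 32 MiB malloc bdevs with 512 B blocks (65536 blocks each), as in the trace.
for b in BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4; do
    "$SPDK/scripts/rpc.py" -s "$SOCK" bdev_malloc_create 32 512 -b "$b"
done
"$SPDK/scripts/rpc.py" -s "$SOCK" bdev_wait_for_examine

# -s requests the on-disk superblock, -r raid1 selects the RAID level.
"$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_create -s -r raid1 \
    -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid

# With all four base bdevs configured the raid should report state "online".
"$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_get_bdevs all \
    | jq -r '.[] | select(.name == "Existed_Raid") | .state'

# Tear down.
"$SPDK/scripts/rpc.py" -s "$SOCK" bdev_raid_delete Existed_Raid
kill "$svc_pid"

The per-bdev property checks seen in the trace (jq .block_size, .md_size, .md_interleave and .dif_type applied to bdev_get_bdevs -b <name>) can be layered on top of this sketch in the same way.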
00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:19.697 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:19.697 [2024-07-12 15:57:40.094883] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:21:19.697 [2024-07-12 15:57:40.094937] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:19.971 [2024-07-12 15:57:40.184606] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:19.971 [2024-07-12 15:57:40.250843] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:19.971 [2024-07-12 15:57:40.294698] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:19.971 [2024-07-12 15:57:40.294725] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:20.542 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:20.542 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:20.542 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:21.112 [2024-07-12 15:57:41.466656] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:21.112 [2024-07-12 15:57:41.466689] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:21.112 [2024-07-12 15:57:41.466695] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:21.112 [2024-07-12 15:57:41.466701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:21.112 [2024-07-12 15:57:41.466706] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:21.112 [2024-07-12 15:57:41.466716] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:21.112 [2024-07-12 15:57:41.466721] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:21.112 [2024-07-12 15:57:41.466726] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.112 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:21.372 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:21.372 "name": "Existed_Raid", 00:21:21.372 "uuid": "82c5e0cf-2583-4846-9dcb-4897f68f661f", 00:21:21.372 "strip_size_kb": 0, 00:21:21.372 "state": "configuring", 00:21:21.372 "raid_level": "raid1", 00:21:21.372 "superblock": true, 00:21:21.372 "num_base_bdevs": 4, 00:21:21.372 "num_base_bdevs_discovered": 0, 00:21:21.372 "num_base_bdevs_operational": 4, 00:21:21.372 "base_bdevs_list": [ 00:21:21.372 { 00:21:21.372 "name": "BaseBdev1", 00:21:21.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.372 "is_configured": false, 00:21:21.372 "data_offset": 0, 00:21:21.372 "data_size": 0 00:21:21.372 }, 00:21:21.372 { 00:21:21.372 "name": "BaseBdev2", 00:21:21.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.372 "is_configured": false, 00:21:21.372 "data_offset": 0, 00:21:21.372 "data_size": 0 00:21:21.372 }, 00:21:21.372 { 00:21:21.372 "name": "BaseBdev3", 00:21:21.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.372 "is_configured": false, 00:21:21.372 "data_offset": 0, 00:21:21.372 "data_size": 0 00:21:21.372 }, 00:21:21.372 { 00:21:21.372 "name": "BaseBdev4", 00:21:21.372 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:21.372 "is_configured": false, 00:21:21.372 "data_offset": 0, 00:21:21.372 "data_size": 0 00:21:21.372 } 00:21:21.372 ] 00:21:21.372 }' 00:21:21.372 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:21.372 15:57:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:22.311 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:22.882 [2024-07-12 15:57:43.174816] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:22.882 [2024-07-12 15:57:43.174837] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1feb920 name Existed_Raid, state configuring 00:21:22.882 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:23.142 [2024-07-12 15:57:43.383369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:23.142 [2024-07-12 15:57:43.383386] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:23.142 [2024-07-12 15:57:43.383391] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:23.142 [2024-07-12 15:57:43.383397] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:23.142 [2024-07-12 15:57:43.383401] bdev.c:8157:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:23.142 [2024-07-12 15:57:43.383407] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:23.142 [2024-07-12 15:57:43.383411] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:23.142 [2024-07-12 15:57:43.383417] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:23.142 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:23.142 [2024-07-12 15:57:43.582414] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:23.142 BaseBdev1 00:21:23.401 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:23.401 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:23.401 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:23.401 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:23.401 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:23.401 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:23.401 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:23.401 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:23.660 [ 00:21:23.660 { 00:21:23.661 "name": "BaseBdev1", 00:21:23.661 "aliases": [ 00:21:23.661 "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa" 00:21:23.661 ], 00:21:23.661 "product_name": "Malloc disk", 00:21:23.661 "block_size": 512, 00:21:23.661 "num_blocks": 65536, 00:21:23.661 "uuid": "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa", 00:21:23.661 "assigned_rate_limits": { 00:21:23.661 "rw_ios_per_sec": 0, 00:21:23.661 "rw_mbytes_per_sec": 0, 00:21:23.661 "r_mbytes_per_sec": 0, 00:21:23.661 "w_mbytes_per_sec": 0 00:21:23.661 }, 00:21:23.661 "claimed": true, 00:21:23.661 "claim_type": "exclusive_write", 00:21:23.661 "zoned": false, 00:21:23.661 "supported_io_types": { 00:21:23.661 "read": true, 00:21:23.661 "write": true, 00:21:23.661 "unmap": true, 00:21:23.661 "flush": true, 00:21:23.661 "reset": true, 00:21:23.661 "nvme_admin": false, 00:21:23.661 "nvme_io": false, 00:21:23.661 "nvme_io_md": false, 00:21:23.661 "write_zeroes": true, 00:21:23.661 "zcopy": true, 00:21:23.661 "get_zone_info": false, 00:21:23.661 "zone_management": false, 00:21:23.661 "zone_append": false, 00:21:23.661 "compare": false, 00:21:23.661 "compare_and_write": false, 00:21:23.661 "abort": true, 00:21:23.661 "seek_hole": false, 00:21:23.661 "seek_data": false, 00:21:23.661 "copy": true, 00:21:23.661 "nvme_iov_md": false 00:21:23.661 }, 00:21:23.661 "memory_domains": [ 00:21:23.661 { 00:21:23.661 "dma_device_id": "system", 00:21:23.661 "dma_device_type": 1 00:21:23.661 }, 00:21:23.661 { 00:21:23.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.661 "dma_device_type": 2 00:21:23.661 } 00:21:23.661 ], 00:21:23.661 
"driver_specific": {} 00:21:23.661 } 00:21:23.661 ] 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:23.661 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:23.921 15:57:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:23.921 "name": "Existed_Raid", 00:21:23.921 "uuid": "d40e1e53-adbd-4fc8-81e2-015e036133e0", 00:21:23.921 "strip_size_kb": 0, 00:21:23.921 "state": "configuring", 00:21:23.921 "raid_level": "raid1", 00:21:23.921 "superblock": true, 00:21:23.921 "num_base_bdevs": 4, 00:21:23.921 "num_base_bdevs_discovered": 1, 00:21:23.921 "num_base_bdevs_operational": 4, 00:21:23.921 "base_bdevs_list": [ 00:21:23.921 { 00:21:23.921 "name": "BaseBdev1", 00:21:23.921 "uuid": "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa", 00:21:23.921 "is_configured": true, 00:21:23.921 "data_offset": 2048, 00:21:23.921 "data_size": 63488 00:21:23.921 }, 00:21:23.921 { 00:21:23.921 "name": "BaseBdev2", 00:21:23.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:23.921 "is_configured": false, 00:21:23.921 "data_offset": 0, 00:21:23.921 "data_size": 0 00:21:23.921 }, 00:21:23.921 { 00:21:23.921 "name": "BaseBdev3", 00:21:23.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:23.921 "is_configured": false, 00:21:23.921 "data_offset": 0, 00:21:23.921 "data_size": 0 00:21:23.921 }, 00:21:23.921 { 00:21:23.921 "name": "BaseBdev4", 00:21:23.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:23.921 "is_configured": false, 00:21:23.921 "data_offset": 0, 00:21:23.921 "data_size": 0 00:21:23.921 } 00:21:23.921 ] 00:21:23.921 }' 00:21:23.921 15:57:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:23.921 15:57:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:24.861 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:25.121 
[2024-07-12 15:57:45.382954] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:25.121 [2024-07-12 15:57:45.382982] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1feb190 name Existed_Raid, state configuring 00:21:25.121 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:25.691 [2024-07-12 15:57:45.904285] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:25.691 [2024-07-12 15:57:45.905428] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:25.691 [2024-07-12 15:57:45.905451] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:25.691 [2024-07-12 15:57:45.905457] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:25.691 [2024-07-12 15:57:45.905463] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:25.691 [2024-07-12 15:57:45.905468] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:25.691 [2024-07-12 15:57:45.905473] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:25.691 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:26.261 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:26.261 "name": "Existed_Raid", 00:21:26.261 "uuid": "217a5ea1-60a0-42bc-85f1-829155a90799", 00:21:26.261 "strip_size_kb": 0, 00:21:26.261 "state": "configuring", 00:21:26.261 "raid_level": "raid1", 00:21:26.261 "superblock": true, 00:21:26.261 
"num_base_bdevs": 4, 00:21:26.261 "num_base_bdevs_discovered": 1, 00:21:26.261 "num_base_bdevs_operational": 4, 00:21:26.261 "base_bdevs_list": [ 00:21:26.261 { 00:21:26.261 "name": "BaseBdev1", 00:21:26.261 "uuid": "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa", 00:21:26.261 "is_configured": true, 00:21:26.261 "data_offset": 2048, 00:21:26.261 "data_size": 63488 00:21:26.261 }, 00:21:26.261 { 00:21:26.261 "name": "BaseBdev2", 00:21:26.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.261 "is_configured": false, 00:21:26.261 "data_offset": 0, 00:21:26.261 "data_size": 0 00:21:26.261 }, 00:21:26.261 { 00:21:26.261 "name": "BaseBdev3", 00:21:26.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.261 "is_configured": false, 00:21:26.261 "data_offset": 0, 00:21:26.261 "data_size": 0 00:21:26.261 }, 00:21:26.261 { 00:21:26.261 "name": "BaseBdev4", 00:21:26.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:26.261 "is_configured": false, 00:21:26.261 "data_offset": 0, 00:21:26.261 "data_size": 0 00:21:26.262 } 00:21:26.262 ] 00:21:26.262 }' 00:21:26.262 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:26.262 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:27.203 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:27.203 [2024-07-12 15:57:47.545408] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:27.203 BaseBdev2 00:21:27.203 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:27.203 15:57:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:27.203 15:57:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:27.203 15:57:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:27.203 15:57:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:27.203 15:57:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:27.203 15:57:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:27.463 15:57:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:27.723 [ 00:21:27.723 { 00:21:27.723 "name": "BaseBdev2", 00:21:27.723 "aliases": [ 00:21:27.723 "aad190c3-2df6-4e97-8b03-e9b73d52f396" 00:21:27.723 ], 00:21:27.723 "product_name": "Malloc disk", 00:21:27.723 "block_size": 512, 00:21:27.723 "num_blocks": 65536, 00:21:27.723 "uuid": "aad190c3-2df6-4e97-8b03-e9b73d52f396", 00:21:27.723 "assigned_rate_limits": { 00:21:27.723 "rw_ios_per_sec": 0, 00:21:27.723 "rw_mbytes_per_sec": 0, 00:21:27.723 "r_mbytes_per_sec": 0, 00:21:27.723 "w_mbytes_per_sec": 0 00:21:27.723 }, 00:21:27.723 "claimed": true, 00:21:27.723 "claim_type": "exclusive_write", 00:21:27.723 "zoned": false, 00:21:27.723 "supported_io_types": { 00:21:27.723 "read": true, 00:21:27.723 "write": true, 00:21:27.723 "unmap": true, 00:21:27.723 "flush": true, 
00:21:27.723 "reset": true, 00:21:27.723 "nvme_admin": false, 00:21:27.723 "nvme_io": false, 00:21:27.723 "nvme_io_md": false, 00:21:27.723 "write_zeroes": true, 00:21:27.723 "zcopy": true, 00:21:27.723 "get_zone_info": false, 00:21:27.723 "zone_management": false, 00:21:27.723 "zone_append": false, 00:21:27.723 "compare": false, 00:21:27.723 "compare_and_write": false, 00:21:27.723 "abort": true, 00:21:27.723 "seek_hole": false, 00:21:27.723 "seek_data": false, 00:21:27.723 "copy": true, 00:21:27.723 "nvme_iov_md": false 00:21:27.723 }, 00:21:27.723 "memory_domains": [ 00:21:27.723 { 00:21:27.723 "dma_device_id": "system", 00:21:27.723 "dma_device_type": 1 00:21:27.723 }, 00:21:27.723 { 00:21:27.723 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:27.723 "dma_device_type": 2 00:21:27.723 } 00:21:27.723 ], 00:21:27.723 "driver_specific": {} 00:21:27.723 } 00:21:27.723 ] 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.723 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:27.723 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.723 "name": "Existed_Raid", 00:21:27.723 "uuid": "217a5ea1-60a0-42bc-85f1-829155a90799", 00:21:27.723 "strip_size_kb": 0, 00:21:27.723 "state": "configuring", 00:21:27.723 "raid_level": "raid1", 00:21:27.723 "superblock": true, 00:21:27.723 "num_base_bdevs": 4, 00:21:27.723 "num_base_bdevs_discovered": 2, 00:21:27.723 "num_base_bdevs_operational": 4, 00:21:27.723 "base_bdevs_list": [ 00:21:27.723 { 00:21:27.723 "name": "BaseBdev1", 00:21:27.723 "uuid": "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa", 00:21:27.723 "is_configured": true, 00:21:27.723 "data_offset": 2048, 00:21:27.723 "data_size": 63488 00:21:27.723 }, 00:21:27.723 { 00:21:27.723 "name": "BaseBdev2", 00:21:27.723 "uuid": 
"aad190c3-2df6-4e97-8b03-e9b73d52f396", 00:21:27.723 "is_configured": true, 00:21:27.723 "data_offset": 2048, 00:21:27.723 "data_size": 63488 00:21:27.723 }, 00:21:27.723 { 00:21:27.723 "name": "BaseBdev3", 00:21:27.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.723 "is_configured": false, 00:21:27.723 "data_offset": 0, 00:21:27.723 "data_size": 0 00:21:27.723 }, 00:21:27.723 { 00:21:27.723 "name": "BaseBdev4", 00:21:27.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.723 "is_configured": false, 00:21:27.723 "data_offset": 0, 00:21:27.723 "data_size": 0 00:21:27.723 } 00:21:27.723 ] 00:21:27.723 }' 00:21:27.723 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.723 15:57:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:28.293 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:28.554 [2024-07-12 15:57:48.865773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:28.554 BaseBdev3 00:21:28.554 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:28.554 15:57:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:28.554 15:57:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:28.554 15:57:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:28.554 15:57:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:28.554 15:57:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:28.554 15:57:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:28.814 [ 00:21:28.814 { 00:21:28.814 "name": "BaseBdev3", 00:21:28.814 "aliases": [ 00:21:28.814 "9c5db866-7c32-49eb-84ae-1e0de813481e" 00:21:28.814 ], 00:21:28.814 "product_name": "Malloc disk", 00:21:28.814 "block_size": 512, 00:21:28.814 "num_blocks": 65536, 00:21:28.814 "uuid": "9c5db866-7c32-49eb-84ae-1e0de813481e", 00:21:28.814 "assigned_rate_limits": { 00:21:28.814 "rw_ios_per_sec": 0, 00:21:28.814 "rw_mbytes_per_sec": 0, 00:21:28.814 "r_mbytes_per_sec": 0, 00:21:28.814 "w_mbytes_per_sec": 0 00:21:28.814 }, 00:21:28.814 "claimed": true, 00:21:28.814 "claim_type": "exclusive_write", 00:21:28.814 "zoned": false, 00:21:28.814 "supported_io_types": { 00:21:28.814 "read": true, 00:21:28.814 "write": true, 00:21:28.814 "unmap": true, 00:21:28.814 "flush": true, 00:21:28.814 "reset": true, 00:21:28.814 "nvme_admin": false, 00:21:28.814 "nvme_io": false, 00:21:28.814 "nvme_io_md": false, 00:21:28.814 "write_zeroes": true, 00:21:28.814 "zcopy": true, 00:21:28.814 "get_zone_info": false, 00:21:28.814 "zone_management": false, 00:21:28.814 "zone_append": false, 00:21:28.814 "compare": false, 00:21:28.814 "compare_and_write": false, 00:21:28.814 "abort": true, 00:21:28.814 "seek_hole": false, 00:21:28.814 
"seek_data": false, 00:21:28.814 "copy": true, 00:21:28.814 "nvme_iov_md": false 00:21:28.814 }, 00:21:28.814 "memory_domains": [ 00:21:28.814 { 00:21:28.814 "dma_device_id": "system", 00:21:28.814 "dma_device_type": 1 00:21:28.814 }, 00:21:28.814 { 00:21:28.814 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:28.814 "dma_device_type": 2 00:21:28.814 } 00:21:28.814 ], 00:21:28.814 "driver_specific": {} 00:21:28.814 } 00:21:28.814 ] 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.814 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.075 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.075 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:29.075 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:29.075 "name": "Existed_Raid", 00:21:29.075 "uuid": "217a5ea1-60a0-42bc-85f1-829155a90799", 00:21:29.075 "strip_size_kb": 0, 00:21:29.075 "state": "configuring", 00:21:29.075 "raid_level": "raid1", 00:21:29.075 "superblock": true, 00:21:29.075 "num_base_bdevs": 4, 00:21:29.075 "num_base_bdevs_discovered": 3, 00:21:29.075 "num_base_bdevs_operational": 4, 00:21:29.075 "base_bdevs_list": [ 00:21:29.075 { 00:21:29.075 "name": "BaseBdev1", 00:21:29.075 "uuid": "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa", 00:21:29.075 "is_configured": true, 00:21:29.075 "data_offset": 2048, 00:21:29.075 "data_size": 63488 00:21:29.075 }, 00:21:29.075 { 00:21:29.075 "name": "BaseBdev2", 00:21:29.075 "uuid": "aad190c3-2df6-4e97-8b03-e9b73d52f396", 00:21:29.075 "is_configured": true, 00:21:29.075 "data_offset": 2048, 00:21:29.075 "data_size": 63488 00:21:29.075 }, 00:21:29.075 { 00:21:29.075 "name": "BaseBdev3", 00:21:29.075 "uuid": "9c5db866-7c32-49eb-84ae-1e0de813481e", 00:21:29.075 "is_configured": true, 00:21:29.075 "data_offset": 2048, 00:21:29.075 "data_size": 63488 00:21:29.075 }, 00:21:29.075 { 00:21:29.075 "name": "BaseBdev4", 00:21:29.075 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:21:29.075 "is_configured": false, 00:21:29.075 "data_offset": 0, 00:21:29.075 "data_size": 0 00:21:29.075 } 00:21:29.075 ] 00:21:29.075 }' 00:21:29.075 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:29.075 15:57:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:29.645 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:29.904 [2024-07-12 15:57:50.165943] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:29.904 [2024-07-12 15:57:50.166067] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fec1d0 00:21:29.904 [2024-07-12 15:57:50.166075] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:29.904 [2024-07-12 15:57:50.166215] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fed220 00:21:29.904 [2024-07-12 15:57:50.166313] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fec1d0 00:21:29.904 [2024-07-12 15:57:50.166319] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fec1d0 00:21:29.904 [2024-07-12 15:57:50.166386] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:29.904 BaseBdev4 00:21:29.904 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:29.904 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:29.904 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:29.904 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:29.904 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:29.904 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:29.904 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:29.904 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:30.163 [ 00:21:30.163 { 00:21:30.163 "name": "BaseBdev4", 00:21:30.163 "aliases": [ 00:21:30.163 "51ebe8a4-85ef-4a71-a343-38863b283f7d" 00:21:30.163 ], 00:21:30.163 "product_name": "Malloc disk", 00:21:30.163 "block_size": 512, 00:21:30.163 "num_blocks": 65536, 00:21:30.163 "uuid": "51ebe8a4-85ef-4a71-a343-38863b283f7d", 00:21:30.163 "assigned_rate_limits": { 00:21:30.163 "rw_ios_per_sec": 0, 00:21:30.163 "rw_mbytes_per_sec": 0, 00:21:30.163 "r_mbytes_per_sec": 0, 00:21:30.163 "w_mbytes_per_sec": 0 00:21:30.163 }, 00:21:30.163 "claimed": true, 00:21:30.163 "claim_type": "exclusive_write", 00:21:30.163 "zoned": false, 00:21:30.163 "supported_io_types": { 00:21:30.163 "read": true, 00:21:30.163 "write": true, 00:21:30.163 "unmap": true, 00:21:30.163 "flush": true, 00:21:30.163 "reset": true, 00:21:30.163 "nvme_admin": false, 00:21:30.163 "nvme_io": false, 00:21:30.163 "nvme_io_md": false, 00:21:30.163 
"write_zeroes": true, 00:21:30.163 "zcopy": true, 00:21:30.163 "get_zone_info": false, 00:21:30.163 "zone_management": false, 00:21:30.163 "zone_append": false, 00:21:30.163 "compare": false, 00:21:30.163 "compare_and_write": false, 00:21:30.163 "abort": true, 00:21:30.163 "seek_hole": false, 00:21:30.163 "seek_data": false, 00:21:30.163 "copy": true, 00:21:30.163 "nvme_iov_md": false 00:21:30.163 }, 00:21:30.163 "memory_domains": [ 00:21:30.163 { 00:21:30.163 "dma_device_id": "system", 00:21:30.163 "dma_device_type": 1 00:21:30.163 }, 00:21:30.163 { 00:21:30.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:30.163 "dma_device_type": 2 00:21:30.163 } 00:21:30.163 ], 00:21:30.163 "driver_specific": {} 00:21:30.163 } 00:21:30.163 ] 00:21:30.163 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:30.163 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.164 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:30.425 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.425 "name": "Existed_Raid", 00:21:30.425 "uuid": "217a5ea1-60a0-42bc-85f1-829155a90799", 00:21:30.425 "strip_size_kb": 0, 00:21:30.425 "state": "online", 00:21:30.425 "raid_level": "raid1", 00:21:30.425 "superblock": true, 00:21:30.425 "num_base_bdevs": 4, 00:21:30.425 "num_base_bdevs_discovered": 4, 00:21:30.425 "num_base_bdevs_operational": 4, 00:21:30.425 "base_bdevs_list": [ 00:21:30.425 { 00:21:30.425 "name": "BaseBdev1", 00:21:30.425 "uuid": "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa", 00:21:30.425 "is_configured": true, 00:21:30.425 "data_offset": 2048, 00:21:30.425 "data_size": 63488 00:21:30.425 }, 00:21:30.425 { 00:21:30.425 "name": "BaseBdev2", 00:21:30.425 "uuid": "aad190c3-2df6-4e97-8b03-e9b73d52f396", 00:21:30.425 "is_configured": true, 00:21:30.425 "data_offset": 2048, 00:21:30.425 "data_size": 63488 00:21:30.425 }, 00:21:30.425 { 
00:21:30.425 "name": "BaseBdev3", 00:21:30.425 "uuid": "9c5db866-7c32-49eb-84ae-1e0de813481e", 00:21:30.425 "is_configured": true, 00:21:30.425 "data_offset": 2048, 00:21:30.425 "data_size": 63488 00:21:30.425 }, 00:21:30.425 { 00:21:30.425 "name": "BaseBdev4", 00:21:30.425 "uuid": "51ebe8a4-85ef-4a71-a343-38863b283f7d", 00:21:30.425 "is_configured": true, 00:21:30.425 "data_offset": 2048, 00:21:30.425 "data_size": 63488 00:21:30.425 } 00:21:30.425 ] 00:21:30.425 }' 00:21:30.425 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.425 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:31.364 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:31.364 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:31.364 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:31.364 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:31.364 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:31.364 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:31.364 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:31.364 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:31.364 [2024-07-12 15:57:51.802335] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:31.624 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:31.624 "name": "Existed_Raid", 00:21:31.624 "aliases": [ 00:21:31.624 "217a5ea1-60a0-42bc-85f1-829155a90799" 00:21:31.624 ], 00:21:31.624 "product_name": "Raid Volume", 00:21:31.624 "block_size": 512, 00:21:31.624 "num_blocks": 63488, 00:21:31.624 "uuid": "217a5ea1-60a0-42bc-85f1-829155a90799", 00:21:31.624 "assigned_rate_limits": { 00:21:31.624 "rw_ios_per_sec": 0, 00:21:31.624 "rw_mbytes_per_sec": 0, 00:21:31.624 "r_mbytes_per_sec": 0, 00:21:31.624 "w_mbytes_per_sec": 0 00:21:31.624 }, 00:21:31.624 "claimed": false, 00:21:31.624 "zoned": false, 00:21:31.624 "supported_io_types": { 00:21:31.624 "read": true, 00:21:31.624 "write": true, 00:21:31.624 "unmap": false, 00:21:31.624 "flush": false, 00:21:31.624 "reset": true, 00:21:31.624 "nvme_admin": false, 00:21:31.624 "nvme_io": false, 00:21:31.624 "nvme_io_md": false, 00:21:31.624 "write_zeroes": true, 00:21:31.624 "zcopy": false, 00:21:31.624 "get_zone_info": false, 00:21:31.624 "zone_management": false, 00:21:31.624 "zone_append": false, 00:21:31.624 "compare": false, 00:21:31.624 "compare_and_write": false, 00:21:31.624 "abort": false, 00:21:31.624 "seek_hole": false, 00:21:31.624 "seek_data": false, 00:21:31.624 "copy": false, 00:21:31.624 "nvme_iov_md": false 00:21:31.624 }, 00:21:31.624 "memory_domains": [ 00:21:31.624 { 00:21:31.624 "dma_device_id": "system", 00:21:31.624 "dma_device_type": 1 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.624 "dma_device_type": 2 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "dma_device_id": "system", 00:21:31.624 "dma_device_type": 1 00:21:31.624 }, 00:21:31.624 { 
00:21:31.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.624 "dma_device_type": 2 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "dma_device_id": "system", 00:21:31.624 "dma_device_type": 1 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.624 "dma_device_type": 2 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "dma_device_id": "system", 00:21:31.624 "dma_device_type": 1 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.624 "dma_device_type": 2 00:21:31.624 } 00:21:31.624 ], 00:21:31.624 "driver_specific": { 00:21:31.624 "raid": { 00:21:31.624 "uuid": "217a5ea1-60a0-42bc-85f1-829155a90799", 00:21:31.624 "strip_size_kb": 0, 00:21:31.624 "state": "online", 00:21:31.624 "raid_level": "raid1", 00:21:31.624 "superblock": true, 00:21:31.624 "num_base_bdevs": 4, 00:21:31.624 "num_base_bdevs_discovered": 4, 00:21:31.624 "num_base_bdevs_operational": 4, 00:21:31.624 "base_bdevs_list": [ 00:21:31.624 { 00:21:31.624 "name": "BaseBdev1", 00:21:31.624 "uuid": "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa", 00:21:31.624 "is_configured": true, 00:21:31.624 "data_offset": 2048, 00:21:31.624 "data_size": 63488 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "name": "BaseBdev2", 00:21:31.624 "uuid": "aad190c3-2df6-4e97-8b03-e9b73d52f396", 00:21:31.624 "is_configured": true, 00:21:31.624 "data_offset": 2048, 00:21:31.624 "data_size": 63488 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "name": "BaseBdev3", 00:21:31.624 "uuid": "9c5db866-7c32-49eb-84ae-1e0de813481e", 00:21:31.624 "is_configured": true, 00:21:31.624 "data_offset": 2048, 00:21:31.624 "data_size": 63488 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "name": "BaseBdev4", 00:21:31.624 "uuid": "51ebe8a4-85ef-4a71-a343-38863b283f7d", 00:21:31.624 "is_configured": true, 00:21:31.624 "data_offset": 2048, 00:21:31.624 "data_size": 63488 00:21:31.624 } 00:21:31.624 ] 00:21:31.624 } 00:21:31.624 } 00:21:31.624 }' 00:21:31.624 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:31.624 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:31.624 BaseBdev2 00:21:31.624 BaseBdev3 00:21:31.624 BaseBdev4' 00:21:31.624 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:31.624 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:31.624 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:31.624 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:31.624 "name": "BaseBdev1", 00:21:31.624 "aliases": [ 00:21:31.624 "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa" 00:21:31.624 ], 00:21:31.624 "product_name": "Malloc disk", 00:21:31.624 "block_size": 512, 00:21:31.624 "num_blocks": 65536, 00:21:31.624 "uuid": "7611e7a1-919a-4b2a-9f0e-13bcc37d4caa", 00:21:31.624 "assigned_rate_limits": { 00:21:31.624 "rw_ios_per_sec": 0, 00:21:31.624 "rw_mbytes_per_sec": 0, 00:21:31.624 "r_mbytes_per_sec": 0, 00:21:31.624 "w_mbytes_per_sec": 0 00:21:31.624 }, 00:21:31.624 "claimed": true, 00:21:31.624 "claim_type": "exclusive_write", 00:21:31.624 "zoned": false, 00:21:31.624 "supported_io_types": { 00:21:31.624 "read": true, 00:21:31.624 "write": true, 
00:21:31.624 "unmap": true, 00:21:31.624 "flush": true, 00:21:31.624 "reset": true, 00:21:31.624 "nvme_admin": false, 00:21:31.624 "nvme_io": false, 00:21:31.624 "nvme_io_md": false, 00:21:31.624 "write_zeroes": true, 00:21:31.624 "zcopy": true, 00:21:31.624 "get_zone_info": false, 00:21:31.624 "zone_management": false, 00:21:31.624 "zone_append": false, 00:21:31.624 "compare": false, 00:21:31.624 "compare_and_write": false, 00:21:31.624 "abort": true, 00:21:31.624 "seek_hole": false, 00:21:31.624 "seek_data": false, 00:21:31.624 "copy": true, 00:21:31.624 "nvme_iov_md": false 00:21:31.624 }, 00:21:31.624 "memory_domains": [ 00:21:31.624 { 00:21:31.624 "dma_device_id": "system", 00:21:31.624 "dma_device_type": 1 00:21:31.624 }, 00:21:31.624 { 00:21:31.624 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:31.624 "dma_device_type": 2 00:21:31.624 } 00:21:31.624 ], 00:21:31.624 "driver_specific": {} 00:21:31.624 }' 00:21:31.624 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:31.884 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:31.884 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:31.884 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:31.884 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:31.884 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:31.884 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:31.884 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:31.884 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:31.884 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.145 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.145 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:32.145 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:32.145 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:32.145 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:32.145 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:32.145 "name": "BaseBdev2", 00:21:32.145 "aliases": [ 00:21:32.145 "aad190c3-2df6-4e97-8b03-e9b73d52f396" 00:21:32.145 ], 00:21:32.145 "product_name": "Malloc disk", 00:21:32.145 "block_size": 512, 00:21:32.145 "num_blocks": 65536, 00:21:32.145 "uuid": "aad190c3-2df6-4e97-8b03-e9b73d52f396", 00:21:32.145 "assigned_rate_limits": { 00:21:32.145 "rw_ios_per_sec": 0, 00:21:32.145 "rw_mbytes_per_sec": 0, 00:21:32.145 "r_mbytes_per_sec": 0, 00:21:32.145 "w_mbytes_per_sec": 0 00:21:32.145 }, 00:21:32.145 "claimed": true, 00:21:32.145 "claim_type": "exclusive_write", 00:21:32.145 "zoned": false, 00:21:32.145 "supported_io_types": { 00:21:32.145 "read": true, 00:21:32.145 "write": true, 00:21:32.145 "unmap": true, 00:21:32.145 "flush": true, 00:21:32.145 "reset": true, 00:21:32.145 "nvme_admin": false, 00:21:32.145 
"nvme_io": false, 00:21:32.145 "nvme_io_md": false, 00:21:32.145 "write_zeroes": true, 00:21:32.145 "zcopy": true, 00:21:32.145 "get_zone_info": false, 00:21:32.145 "zone_management": false, 00:21:32.145 "zone_append": false, 00:21:32.145 "compare": false, 00:21:32.145 "compare_and_write": false, 00:21:32.145 "abort": true, 00:21:32.145 "seek_hole": false, 00:21:32.145 "seek_data": false, 00:21:32.145 "copy": true, 00:21:32.145 "nvme_iov_md": false 00:21:32.145 }, 00:21:32.145 "memory_domains": [ 00:21:32.145 { 00:21:32.145 "dma_device_id": "system", 00:21:32.145 "dma_device_type": 1 00:21:32.145 }, 00:21:32.145 { 00:21:32.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.145 "dma_device_type": 2 00:21:32.145 } 00:21:32.145 ], 00:21:32.145 "driver_specific": {} 00:21:32.145 }' 00:21:32.145 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.405 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.405 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:32.405 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.405 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.405 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:32.405 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.405 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.665 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:32.665 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.665 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:32.665 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:32.665 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:32.665 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:32.665 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:32.924 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:32.924 "name": "BaseBdev3", 00:21:32.924 "aliases": [ 00:21:32.924 "9c5db866-7c32-49eb-84ae-1e0de813481e" 00:21:32.924 ], 00:21:32.924 "product_name": "Malloc disk", 00:21:32.924 "block_size": 512, 00:21:32.924 "num_blocks": 65536, 00:21:32.924 "uuid": "9c5db866-7c32-49eb-84ae-1e0de813481e", 00:21:32.924 "assigned_rate_limits": { 00:21:32.924 "rw_ios_per_sec": 0, 00:21:32.924 "rw_mbytes_per_sec": 0, 00:21:32.924 "r_mbytes_per_sec": 0, 00:21:32.924 "w_mbytes_per_sec": 0 00:21:32.924 }, 00:21:32.924 "claimed": true, 00:21:32.924 "claim_type": "exclusive_write", 00:21:32.924 "zoned": false, 00:21:32.924 "supported_io_types": { 00:21:32.924 "read": true, 00:21:32.924 "write": true, 00:21:32.924 "unmap": true, 00:21:32.924 "flush": true, 00:21:32.924 "reset": true, 00:21:32.924 "nvme_admin": false, 00:21:32.924 "nvme_io": false, 00:21:32.924 "nvme_io_md": false, 00:21:32.924 "write_zeroes": true, 00:21:32.924 "zcopy": true, 00:21:32.924 
"get_zone_info": false, 00:21:32.924 "zone_management": false, 00:21:32.924 "zone_append": false, 00:21:32.924 "compare": false, 00:21:32.924 "compare_and_write": false, 00:21:32.924 "abort": true, 00:21:32.924 "seek_hole": false, 00:21:32.924 "seek_data": false, 00:21:32.924 "copy": true, 00:21:32.924 "nvme_iov_md": false 00:21:32.924 }, 00:21:32.924 "memory_domains": [ 00:21:32.924 { 00:21:32.924 "dma_device_id": "system", 00:21:32.924 "dma_device_type": 1 00:21:32.924 }, 00:21:32.924 { 00:21:32.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.924 "dma_device_type": 2 00:21:32.924 } 00:21:32.924 ], 00:21:32.924 "driver_specific": {} 00:21:32.924 }' 00:21:32.924 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.924 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:32.924 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:32.924 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.924 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:32.924 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:32.924 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:32.924 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.183 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:33.183 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.183 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.183 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:33.183 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:33.183 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:33.183 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:33.443 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:33.443 "name": "BaseBdev4", 00:21:33.443 "aliases": [ 00:21:33.443 "51ebe8a4-85ef-4a71-a343-38863b283f7d" 00:21:33.443 ], 00:21:33.443 "product_name": "Malloc disk", 00:21:33.443 "block_size": 512, 00:21:33.443 "num_blocks": 65536, 00:21:33.443 "uuid": "51ebe8a4-85ef-4a71-a343-38863b283f7d", 00:21:33.443 "assigned_rate_limits": { 00:21:33.443 "rw_ios_per_sec": 0, 00:21:33.443 "rw_mbytes_per_sec": 0, 00:21:33.443 "r_mbytes_per_sec": 0, 00:21:33.443 "w_mbytes_per_sec": 0 00:21:33.443 }, 00:21:33.443 "claimed": true, 00:21:33.443 "claim_type": "exclusive_write", 00:21:33.443 "zoned": false, 00:21:33.443 "supported_io_types": { 00:21:33.443 "read": true, 00:21:33.443 "write": true, 00:21:33.443 "unmap": true, 00:21:33.443 "flush": true, 00:21:33.443 "reset": true, 00:21:33.443 "nvme_admin": false, 00:21:33.443 "nvme_io": false, 00:21:33.443 "nvme_io_md": false, 00:21:33.443 "write_zeroes": true, 00:21:33.443 "zcopy": true, 00:21:33.443 "get_zone_info": false, 00:21:33.443 "zone_management": false, 00:21:33.443 "zone_append": false, 00:21:33.443 "compare": false, 
00:21:33.443 "compare_and_write": false, 00:21:33.443 "abort": true, 00:21:33.443 "seek_hole": false, 00:21:33.443 "seek_data": false, 00:21:33.443 "copy": true, 00:21:33.443 "nvme_iov_md": false 00:21:33.443 }, 00:21:33.443 "memory_domains": [ 00:21:33.443 { 00:21:33.443 "dma_device_id": "system", 00:21:33.443 "dma_device_type": 1 00:21:33.443 }, 00:21:33.443 { 00:21:33.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.443 "dma_device_type": 2 00:21:33.443 } 00:21:33.443 ], 00:21:33.443 "driver_specific": {} 00:21:33.443 }' 00:21:33.443 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.443 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.443 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:33.443 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.443 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.443 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:33.443 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.443 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.703 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:33.703 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.703 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.703 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:33.703 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:33.962 [2024-07-12 15:57:54.172085] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:33.962 15:57:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.962 "name": "Existed_Raid", 00:21:33.962 "uuid": "217a5ea1-60a0-42bc-85f1-829155a90799", 00:21:33.962 "strip_size_kb": 0, 00:21:33.962 "state": "online", 00:21:33.962 "raid_level": "raid1", 00:21:33.962 "superblock": true, 00:21:33.962 "num_base_bdevs": 4, 00:21:33.962 "num_base_bdevs_discovered": 3, 00:21:33.962 "num_base_bdevs_operational": 3, 00:21:33.962 "base_bdevs_list": [ 00:21:33.962 { 00:21:33.962 "name": null, 00:21:33.962 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.962 "is_configured": false, 00:21:33.962 "data_offset": 2048, 00:21:33.962 "data_size": 63488 00:21:33.962 }, 00:21:33.962 { 00:21:33.962 "name": "BaseBdev2", 00:21:33.962 "uuid": "aad190c3-2df6-4e97-8b03-e9b73d52f396", 00:21:33.962 "is_configured": true, 00:21:33.962 "data_offset": 2048, 00:21:33.962 "data_size": 63488 00:21:33.962 }, 00:21:33.962 { 00:21:33.962 "name": "BaseBdev3", 00:21:33.962 "uuid": "9c5db866-7c32-49eb-84ae-1e0de813481e", 00:21:33.962 "is_configured": true, 00:21:33.962 "data_offset": 2048, 00:21:33.962 "data_size": 63488 00:21:33.962 }, 00:21:33.962 { 00:21:33.962 "name": "BaseBdev4", 00:21:33.962 "uuid": "51ebe8a4-85ef-4a71-a343-38863b283f7d", 00:21:33.962 "is_configured": true, 00:21:33.962 "data_offset": 2048, 00:21:33.962 "data_size": 63488 00:21:33.962 } 00:21:33.962 ] 00:21:33.962 }' 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.962 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:34.531 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:34.531 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:34.532 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:34.532 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:34.791 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:34.791 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:34.791 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:35.052 [2024-07-12 15:57:55.319002] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:35.052 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:35.052 15:57:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:35.052 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.052 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:35.332 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:35.332 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:35.332 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:35.332 [2024-07-12 15:57:55.705689] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:35.332 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:35.332 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:35.332 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.332 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:35.591 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:35.591 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:35.591 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:35.851 [2024-07-12 15:57:56.092494] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:35.851 [2024-07-12 15:57:56.092552] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:35.851 [2024-07-12 15:57:56.098544] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:35.851 [2024-07-12 15:57:56.098568] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:35.851 [2024-07-12 15:57:56.098574] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fec1d0 name Existed_Raid, state offline 00:21:35.851 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:35.851 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:35.851 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.851 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:36.110 15:57:56 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:36.110 BaseBdev2 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:36.110 15:57:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:36.369 15:57:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:36.629 [ 00:21:36.629 { 00:21:36.629 "name": "BaseBdev2", 00:21:36.629 "aliases": [ 00:21:36.629 "c27cec81-329b-4058-be45-d2cde6294426" 00:21:36.629 ], 00:21:36.629 "product_name": "Malloc disk", 00:21:36.629 "block_size": 512, 00:21:36.629 "num_blocks": 65536, 00:21:36.629 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:36.629 "assigned_rate_limits": { 00:21:36.629 "rw_ios_per_sec": 0, 00:21:36.629 "rw_mbytes_per_sec": 0, 00:21:36.629 "r_mbytes_per_sec": 0, 00:21:36.629 "w_mbytes_per_sec": 0 00:21:36.629 }, 00:21:36.629 "claimed": false, 00:21:36.629 "zoned": false, 00:21:36.629 "supported_io_types": { 00:21:36.629 "read": true, 00:21:36.629 "write": true, 00:21:36.629 "unmap": true, 00:21:36.629 "flush": true, 00:21:36.629 "reset": true, 00:21:36.629 "nvme_admin": false, 00:21:36.629 "nvme_io": false, 00:21:36.629 "nvme_io_md": false, 00:21:36.629 "write_zeroes": true, 00:21:36.629 "zcopy": true, 00:21:36.629 "get_zone_info": false, 00:21:36.629 "zone_management": false, 00:21:36.629 "zone_append": false, 00:21:36.629 "compare": false, 00:21:36.629 "compare_and_write": false, 00:21:36.629 "abort": true, 00:21:36.629 "seek_hole": false, 00:21:36.629 "seek_data": false, 00:21:36.629 "copy": true, 00:21:36.629 "nvme_iov_md": false 00:21:36.629 }, 00:21:36.629 "memory_domains": [ 00:21:36.629 { 00:21:36.629 "dma_device_id": "system", 00:21:36.629 "dma_device_type": 1 00:21:36.629 }, 00:21:36.629 { 00:21:36.629 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:36.629 "dma_device_type": 2 00:21:36.629 } 00:21:36.629 ], 00:21:36.629 "driver_specific": {} 00:21:36.629 } 00:21:36.629 ] 00:21:36.629 15:57:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:36.629 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:36.629 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:36.629 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:36.889 BaseBdev3 00:21:36.889 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:36.889 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:36.889 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:36.889 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:36.889 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:36.889 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:36.889 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:36.889 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:37.150 [ 00:21:37.150 { 00:21:37.150 "name": "BaseBdev3", 00:21:37.150 "aliases": [ 00:21:37.150 "6c73cad1-e165-492e-a44e-893a8da13e6f" 00:21:37.150 ], 00:21:37.150 "product_name": "Malloc disk", 00:21:37.150 "block_size": 512, 00:21:37.150 "num_blocks": 65536, 00:21:37.150 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:37.150 "assigned_rate_limits": { 00:21:37.150 "rw_ios_per_sec": 0, 00:21:37.150 "rw_mbytes_per_sec": 0, 00:21:37.150 "r_mbytes_per_sec": 0, 00:21:37.150 "w_mbytes_per_sec": 0 00:21:37.150 }, 00:21:37.150 "claimed": false, 00:21:37.150 "zoned": false, 00:21:37.150 "supported_io_types": { 00:21:37.150 "read": true, 00:21:37.150 "write": true, 00:21:37.150 "unmap": true, 00:21:37.150 "flush": true, 00:21:37.150 "reset": true, 00:21:37.150 "nvme_admin": false, 00:21:37.150 "nvme_io": false, 00:21:37.150 "nvme_io_md": false, 00:21:37.150 "write_zeroes": true, 00:21:37.150 "zcopy": true, 00:21:37.150 "get_zone_info": false, 00:21:37.150 "zone_management": false, 00:21:37.150 "zone_append": false, 00:21:37.150 "compare": false, 00:21:37.150 "compare_and_write": false, 00:21:37.150 "abort": true, 00:21:37.150 "seek_hole": false, 00:21:37.150 "seek_data": false, 00:21:37.150 "copy": true, 00:21:37.150 "nvme_iov_md": false 00:21:37.150 }, 00:21:37.150 "memory_domains": [ 00:21:37.150 { 00:21:37.150 "dma_device_id": "system", 00:21:37.150 "dma_device_type": 1 00:21:37.150 }, 00:21:37.150 { 00:21:37.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.150 "dma_device_type": 2 00:21:37.150 } 00:21:37.150 ], 00:21:37.150 "driver_specific": {} 00:21:37.150 } 00:21:37.150 ] 00:21:37.150 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:37.150 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:37.150 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:37.150 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:37.410 BaseBdev4 00:21:37.410 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:37.410 15:57:57 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:37.410 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:37.410 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:37.410 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:37.410 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:37.410 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:37.671 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:37.671 [ 00:21:37.671 { 00:21:37.671 "name": "BaseBdev4", 00:21:37.671 "aliases": [ 00:21:37.671 "37fafefe-57cf-438d-9169-e8b968964ca8" 00:21:37.671 ], 00:21:37.671 "product_name": "Malloc disk", 00:21:37.671 "block_size": 512, 00:21:37.671 "num_blocks": 65536, 00:21:37.671 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:37.671 "assigned_rate_limits": { 00:21:37.671 "rw_ios_per_sec": 0, 00:21:37.671 "rw_mbytes_per_sec": 0, 00:21:37.671 "r_mbytes_per_sec": 0, 00:21:37.671 "w_mbytes_per_sec": 0 00:21:37.671 }, 00:21:37.671 "claimed": false, 00:21:37.671 "zoned": false, 00:21:37.671 "supported_io_types": { 00:21:37.671 "read": true, 00:21:37.671 "write": true, 00:21:37.671 "unmap": true, 00:21:37.671 "flush": true, 00:21:37.671 "reset": true, 00:21:37.671 "nvme_admin": false, 00:21:37.671 "nvme_io": false, 00:21:37.671 "nvme_io_md": false, 00:21:37.671 "write_zeroes": true, 00:21:37.671 "zcopy": true, 00:21:37.671 "get_zone_info": false, 00:21:37.671 "zone_management": false, 00:21:37.671 "zone_append": false, 00:21:37.671 "compare": false, 00:21:37.671 "compare_and_write": false, 00:21:37.671 "abort": true, 00:21:37.671 "seek_hole": false, 00:21:37.671 "seek_data": false, 00:21:37.671 "copy": true, 00:21:37.671 "nvme_iov_md": false 00:21:37.671 }, 00:21:37.671 "memory_domains": [ 00:21:37.671 { 00:21:37.671 "dma_device_id": "system", 00:21:37.671 "dma_device_type": 1 00:21:37.671 }, 00:21:37.671 { 00:21:37.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.671 "dma_device_type": 2 00:21:37.671 } 00:21:37.671 ], 00:21:37.671 "driver_specific": {} 00:21:37.671 } 00:21:37.671 ] 00:21:37.671 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:37.671 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:37.671 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:37.671 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:37.933 [2024-07-12 15:57:58.255505] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:37.933 [2024-07-12 15:57:58.255532] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:37.933 [2024-07-12 15:57:58.255545] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev2 is claimed 00:21:37.933 [2024-07-12 15:57:58.256607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:37.933 [2024-07-12 15:57:58.256638] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:37.933 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:38.193 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:38.193 "name": "Existed_Raid", 00:21:38.193 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:38.193 "strip_size_kb": 0, 00:21:38.193 "state": "configuring", 00:21:38.193 "raid_level": "raid1", 00:21:38.193 "superblock": true, 00:21:38.193 "num_base_bdevs": 4, 00:21:38.193 "num_base_bdevs_discovered": 3, 00:21:38.193 "num_base_bdevs_operational": 4, 00:21:38.193 "base_bdevs_list": [ 00:21:38.193 { 00:21:38.193 "name": "BaseBdev1", 00:21:38.193 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:38.193 "is_configured": false, 00:21:38.193 "data_offset": 0, 00:21:38.193 "data_size": 0 00:21:38.193 }, 00:21:38.193 { 00:21:38.193 "name": "BaseBdev2", 00:21:38.193 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:38.193 "is_configured": true, 00:21:38.193 "data_offset": 2048, 00:21:38.193 "data_size": 63488 00:21:38.193 }, 00:21:38.193 { 00:21:38.193 "name": "BaseBdev3", 00:21:38.193 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:38.193 "is_configured": true, 00:21:38.193 "data_offset": 2048, 00:21:38.193 "data_size": 63488 00:21:38.193 }, 00:21:38.193 { 00:21:38.193 "name": "BaseBdev4", 00:21:38.193 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:38.193 "is_configured": true, 00:21:38.193 "data_offset": 2048, 00:21:38.193 "data_size": 63488 00:21:38.193 } 00:21:38.193 ] 00:21:38.193 }' 00:21:38.193 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:38.193 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:38.762 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:38.762 [2024-07-12 15:57:59.197863] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.023 "name": "Existed_Raid", 00:21:39.023 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:39.023 "strip_size_kb": 0, 00:21:39.023 "state": "configuring", 00:21:39.023 "raid_level": "raid1", 00:21:39.023 "superblock": true, 00:21:39.023 "num_base_bdevs": 4, 00:21:39.023 "num_base_bdevs_discovered": 2, 00:21:39.023 "num_base_bdevs_operational": 4, 00:21:39.023 "base_bdevs_list": [ 00:21:39.023 { 00:21:39.023 "name": "BaseBdev1", 00:21:39.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.023 "is_configured": false, 00:21:39.023 "data_offset": 0, 00:21:39.023 "data_size": 0 00:21:39.023 }, 00:21:39.023 { 00:21:39.023 "name": null, 00:21:39.023 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:39.023 "is_configured": false, 00:21:39.023 "data_offset": 2048, 00:21:39.023 "data_size": 63488 00:21:39.023 }, 00:21:39.023 { 00:21:39.023 "name": "BaseBdev3", 00:21:39.023 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:39.023 "is_configured": true, 00:21:39.023 "data_offset": 2048, 00:21:39.023 "data_size": 63488 00:21:39.023 }, 00:21:39.023 { 00:21:39.023 "name": "BaseBdev4", 00:21:39.023 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:39.023 "is_configured": true, 00:21:39.023 "data_offset": 2048, 00:21:39.023 "data_size": 63488 00:21:39.023 } 00:21:39.023 ] 00:21:39.023 }' 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.023 15:57:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:39.626 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.626 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:39.946 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:39.946 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:39.946 [2024-07-12 15:58:00.337670] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:39.946 BaseBdev1 00:21:39.946 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:39.946 15:58:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:39.946 15:58:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:39.946 15:58:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:39.946 15:58:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:39.946 15:58:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:39.946 15:58:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:40.207 15:58:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:40.467 [ 00:21:40.467 { 00:21:40.467 "name": "BaseBdev1", 00:21:40.467 "aliases": [ 00:21:40.467 "4715ba63-6fab-4d92-a270-8e06c607a338" 00:21:40.467 ], 00:21:40.467 "product_name": "Malloc disk", 00:21:40.467 "block_size": 512, 00:21:40.467 "num_blocks": 65536, 00:21:40.467 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:40.467 "assigned_rate_limits": { 00:21:40.467 "rw_ios_per_sec": 0, 00:21:40.467 "rw_mbytes_per_sec": 0, 00:21:40.467 "r_mbytes_per_sec": 0, 00:21:40.467 "w_mbytes_per_sec": 0 00:21:40.467 }, 00:21:40.467 "claimed": true, 00:21:40.467 "claim_type": "exclusive_write", 00:21:40.467 "zoned": false, 00:21:40.467 "supported_io_types": { 00:21:40.467 "read": true, 00:21:40.467 "write": true, 00:21:40.467 "unmap": true, 00:21:40.467 "flush": true, 00:21:40.467 "reset": true, 00:21:40.467 "nvme_admin": false, 00:21:40.467 "nvme_io": false, 00:21:40.467 "nvme_io_md": false, 00:21:40.467 "write_zeroes": true, 00:21:40.467 "zcopy": true, 00:21:40.467 "get_zone_info": false, 00:21:40.467 "zone_management": false, 00:21:40.467 "zone_append": false, 00:21:40.467 "compare": false, 00:21:40.467 "compare_and_write": false, 00:21:40.467 "abort": true, 00:21:40.467 "seek_hole": false, 00:21:40.467 "seek_data": false, 00:21:40.467 "copy": true, 00:21:40.467 "nvme_iov_md": false 00:21:40.467 }, 00:21:40.467 "memory_domains": [ 00:21:40.467 { 00:21:40.467 "dma_device_id": "system", 00:21:40.467 "dma_device_type": 1 00:21:40.467 }, 00:21:40.467 { 00:21:40.467 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:40.467 "dma_device_type": 2 00:21:40.467 } 00:21:40.467 ], 00:21:40.467 "driver_specific": {} 00:21:40.467 } 00:21:40.467 ] 00:21:40.467 15:58:00 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.467 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:40.727 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.727 "name": "Existed_Raid", 00:21:40.727 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:40.727 "strip_size_kb": 0, 00:21:40.727 "state": "configuring", 00:21:40.727 "raid_level": "raid1", 00:21:40.727 "superblock": true, 00:21:40.727 "num_base_bdevs": 4, 00:21:40.727 "num_base_bdevs_discovered": 3, 00:21:40.727 "num_base_bdevs_operational": 4, 00:21:40.727 "base_bdevs_list": [ 00:21:40.727 { 00:21:40.727 "name": "BaseBdev1", 00:21:40.727 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:40.727 "is_configured": true, 00:21:40.727 "data_offset": 2048, 00:21:40.727 "data_size": 63488 00:21:40.727 }, 00:21:40.727 { 00:21:40.727 "name": null, 00:21:40.727 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:40.727 "is_configured": false, 00:21:40.727 "data_offset": 2048, 00:21:40.728 "data_size": 63488 00:21:40.728 }, 00:21:40.728 { 00:21:40.728 "name": "BaseBdev3", 00:21:40.728 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:40.728 "is_configured": true, 00:21:40.728 "data_offset": 2048, 00:21:40.728 "data_size": 63488 00:21:40.728 }, 00:21:40.728 { 00:21:40.728 "name": "BaseBdev4", 00:21:40.728 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:40.728 "is_configured": true, 00:21:40.728 "data_offset": 2048, 00:21:40.728 "data_size": 63488 00:21:40.728 } 00:21:40.728 ] 00:21:40.728 }' 00:21:40.728 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.728 15:58:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:41.299 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.299 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
jq '.[0].base_bdevs_list[0].is_configured' 00:21:41.299 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:41.299 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:41.559 [2024-07-12 15:58:01.841561] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.559 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:41.818 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.818 "name": "Existed_Raid", 00:21:41.818 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:41.818 "strip_size_kb": 0, 00:21:41.818 "state": "configuring", 00:21:41.818 "raid_level": "raid1", 00:21:41.818 "superblock": true, 00:21:41.818 "num_base_bdevs": 4, 00:21:41.818 "num_base_bdevs_discovered": 2, 00:21:41.818 "num_base_bdevs_operational": 4, 00:21:41.818 "base_bdevs_list": [ 00:21:41.818 { 00:21:41.818 "name": "BaseBdev1", 00:21:41.818 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:41.818 "is_configured": true, 00:21:41.818 "data_offset": 2048, 00:21:41.818 "data_size": 63488 00:21:41.818 }, 00:21:41.818 { 00:21:41.818 "name": null, 00:21:41.818 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:41.818 "is_configured": false, 00:21:41.818 "data_offset": 2048, 00:21:41.818 "data_size": 63488 00:21:41.818 }, 00:21:41.818 { 00:21:41.818 "name": null, 00:21:41.818 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:41.818 "is_configured": false, 00:21:41.818 "data_offset": 2048, 00:21:41.818 "data_size": 63488 00:21:41.818 }, 00:21:41.818 { 00:21:41.818 "name": "BaseBdev4", 00:21:41.818 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:41.818 "is_configured": true, 00:21:41.818 "data_offset": 2048, 00:21:41.818 "data_size": 63488 00:21:41.818 } 00:21:41.818 ] 00:21:41.818 }' 00:21:41.818 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:21:41.818 15:58:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:42.388 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.388 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:42.388 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:42.388 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:42.647 [2024-07-12 15:58:02.936356] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.647 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:42.907 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:42.907 "name": "Existed_Raid", 00:21:42.907 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:42.907 "strip_size_kb": 0, 00:21:42.907 "state": "configuring", 00:21:42.907 "raid_level": "raid1", 00:21:42.907 "superblock": true, 00:21:42.907 "num_base_bdevs": 4, 00:21:42.907 "num_base_bdevs_discovered": 3, 00:21:42.907 "num_base_bdevs_operational": 4, 00:21:42.907 "base_bdevs_list": [ 00:21:42.907 { 00:21:42.907 "name": "BaseBdev1", 00:21:42.907 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:42.907 "is_configured": true, 00:21:42.907 "data_offset": 2048, 00:21:42.907 "data_size": 63488 00:21:42.907 }, 00:21:42.907 { 00:21:42.907 "name": null, 00:21:42.907 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:42.907 "is_configured": false, 00:21:42.907 "data_offset": 2048, 00:21:42.907 "data_size": 63488 00:21:42.907 }, 00:21:42.907 { 00:21:42.907 "name": "BaseBdev3", 00:21:42.907 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:42.907 "is_configured": true, 00:21:42.907 
"data_offset": 2048, 00:21:42.907 "data_size": 63488 00:21:42.907 }, 00:21:42.907 { 00:21:42.907 "name": "BaseBdev4", 00:21:42.907 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:42.907 "is_configured": true, 00:21:42.907 "data_offset": 2048, 00:21:42.907 "data_size": 63488 00:21:42.907 } 00:21:42.907 ] 00:21:42.907 }' 00:21:42.907 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:42.907 15:58:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:43.476 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:43.476 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.476 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:43.476 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:43.735 [2024-07-12 15:58:04.035138] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.735 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:43.994 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:43.994 "name": "Existed_Raid", 00:21:43.994 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:43.994 "strip_size_kb": 0, 00:21:43.994 "state": "configuring", 00:21:43.994 "raid_level": "raid1", 00:21:43.994 "superblock": true, 00:21:43.994 "num_base_bdevs": 4, 00:21:43.994 "num_base_bdevs_discovered": 2, 00:21:43.994 "num_base_bdevs_operational": 4, 00:21:43.994 "base_bdevs_list": [ 00:21:43.994 { 00:21:43.994 "name": null, 00:21:43.995 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:43.995 "is_configured": false, 00:21:43.995 "data_offset": 2048, 00:21:43.995 "data_size": 63488 00:21:43.995 }, 
00:21:43.995 { 00:21:43.995 "name": null, 00:21:43.995 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:43.995 "is_configured": false, 00:21:43.995 "data_offset": 2048, 00:21:43.995 "data_size": 63488 00:21:43.995 }, 00:21:43.995 { 00:21:43.995 "name": "BaseBdev3", 00:21:43.995 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:43.995 "is_configured": true, 00:21:43.995 "data_offset": 2048, 00:21:43.995 "data_size": 63488 00:21:43.995 }, 00:21:43.995 { 00:21:43.995 "name": "BaseBdev4", 00:21:43.995 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:43.995 "is_configured": true, 00:21:43.995 "data_offset": 2048, 00:21:43.995 "data_size": 63488 00:21:43.995 } 00:21:43.995 ] 00:21:43.995 }' 00:21:43.995 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:43.995 15:58:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:44.564 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.564 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:44.564 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:44.564 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:44.825 [2024-07-12 15:58:05.143771] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.825 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.086 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.086 "name": "Existed_Raid", 00:21:45.086 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:45.086 "strip_size_kb": 0, 00:21:45.086 "state": "configuring", 00:21:45.086 "raid_level": 
"raid1", 00:21:45.086 "superblock": true, 00:21:45.086 "num_base_bdevs": 4, 00:21:45.086 "num_base_bdevs_discovered": 3, 00:21:45.086 "num_base_bdevs_operational": 4, 00:21:45.086 "base_bdevs_list": [ 00:21:45.086 { 00:21:45.086 "name": null, 00:21:45.086 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:45.086 "is_configured": false, 00:21:45.086 "data_offset": 2048, 00:21:45.086 "data_size": 63488 00:21:45.086 }, 00:21:45.086 { 00:21:45.086 "name": "BaseBdev2", 00:21:45.086 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:45.086 "is_configured": true, 00:21:45.086 "data_offset": 2048, 00:21:45.086 "data_size": 63488 00:21:45.086 }, 00:21:45.086 { 00:21:45.086 "name": "BaseBdev3", 00:21:45.086 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:45.086 "is_configured": true, 00:21:45.086 "data_offset": 2048, 00:21:45.086 "data_size": 63488 00:21:45.086 }, 00:21:45.086 { 00:21:45.086 "name": "BaseBdev4", 00:21:45.086 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:45.086 "is_configured": true, 00:21:45.086 "data_offset": 2048, 00:21:45.086 "data_size": 63488 00:21:45.086 } 00:21:45.086 ] 00:21:45.086 }' 00:21:45.086 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.086 15:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:45.656 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.656 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:45.656 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:45.656 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.656 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:45.917 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4715ba63-6fab-4d92-a270-8e06c607a338 00:21:46.177 [2024-07-12 15:58:06.464087] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:46.177 [2024-07-12 15:58:06.464205] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fefa70 00:21:46.177 [2024-07-12 15:58:06.464217] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:46.177 [2024-07-12 15:58:06.464350] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fe3a90 00:21:46.177 [2024-07-12 15:58:06.464444] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fefa70 00:21:46.177 [2024-07-12 15:58:06.464449] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fefa70 00:21:46.177 [2024-07-12 15:58:06.464520] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:46.177 NewBaseBdev 00:21:46.177 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:46.177 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:46.177 
15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:46.177 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:46.177 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:46.177 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:46.177 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:46.437 [ 00:21:46.437 { 00:21:46.437 "name": "NewBaseBdev", 00:21:46.437 "aliases": [ 00:21:46.437 "4715ba63-6fab-4d92-a270-8e06c607a338" 00:21:46.437 ], 00:21:46.437 "product_name": "Malloc disk", 00:21:46.437 "block_size": 512, 00:21:46.437 "num_blocks": 65536, 00:21:46.437 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:46.437 "assigned_rate_limits": { 00:21:46.437 "rw_ios_per_sec": 0, 00:21:46.437 "rw_mbytes_per_sec": 0, 00:21:46.437 "r_mbytes_per_sec": 0, 00:21:46.437 "w_mbytes_per_sec": 0 00:21:46.437 }, 00:21:46.437 "claimed": true, 00:21:46.437 "claim_type": "exclusive_write", 00:21:46.437 "zoned": false, 00:21:46.437 "supported_io_types": { 00:21:46.437 "read": true, 00:21:46.437 "write": true, 00:21:46.437 "unmap": true, 00:21:46.437 "flush": true, 00:21:46.437 "reset": true, 00:21:46.437 "nvme_admin": false, 00:21:46.437 "nvme_io": false, 00:21:46.437 "nvme_io_md": false, 00:21:46.437 "write_zeroes": true, 00:21:46.437 "zcopy": true, 00:21:46.437 "get_zone_info": false, 00:21:46.437 "zone_management": false, 00:21:46.437 "zone_append": false, 00:21:46.437 "compare": false, 00:21:46.437 "compare_and_write": false, 00:21:46.437 "abort": true, 00:21:46.437 "seek_hole": false, 00:21:46.437 "seek_data": false, 00:21:46.437 "copy": true, 00:21:46.437 "nvme_iov_md": false 00:21:46.437 }, 00:21:46.437 "memory_domains": [ 00:21:46.437 { 00:21:46.437 "dma_device_id": "system", 00:21:46.437 "dma_device_type": 1 00:21:46.437 }, 00:21:46.437 { 00:21:46.437 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.437 "dma_device_type": 2 00:21:46.437 } 00:21:46.437 ], 00:21:46.437 "driver_specific": {} 00:21:46.437 } 00:21:46.437 ] 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.437 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:46.697 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:46.697 "name": "Existed_Raid", 00:21:46.697 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:46.697 "strip_size_kb": 0, 00:21:46.697 "state": "online", 00:21:46.697 "raid_level": "raid1", 00:21:46.697 "superblock": true, 00:21:46.697 "num_base_bdevs": 4, 00:21:46.697 "num_base_bdevs_discovered": 4, 00:21:46.697 "num_base_bdevs_operational": 4, 00:21:46.697 "base_bdevs_list": [ 00:21:46.697 { 00:21:46.697 "name": "NewBaseBdev", 00:21:46.697 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:46.697 "is_configured": true, 00:21:46.697 "data_offset": 2048, 00:21:46.697 "data_size": 63488 00:21:46.697 }, 00:21:46.697 { 00:21:46.697 "name": "BaseBdev2", 00:21:46.697 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:46.697 "is_configured": true, 00:21:46.697 "data_offset": 2048, 00:21:46.697 "data_size": 63488 00:21:46.697 }, 00:21:46.697 { 00:21:46.697 "name": "BaseBdev3", 00:21:46.697 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:46.697 "is_configured": true, 00:21:46.697 "data_offset": 2048, 00:21:46.697 "data_size": 63488 00:21:46.697 }, 00:21:46.697 { 00:21:46.697 "name": "BaseBdev4", 00:21:46.697 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:46.697 "is_configured": true, 00:21:46.697 "data_offset": 2048, 00:21:46.697 "data_size": 63488 00:21:46.697 } 00:21:46.697 ] 00:21:46.697 }' 00:21:46.697 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:46.697 15:58:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:47.269 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:47.269 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:47.269 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:47.269 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:47.269 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:47.269 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:47.269 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:47.269 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:47.528 [2024-07-12 15:58:07.763628] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:47.528 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:47.528 "name": "Existed_Raid", 00:21:47.528 "aliases": [ 
00:21:47.528 "387a4294-6e85-4363-a145-967fcb3ff28a" 00:21:47.528 ], 00:21:47.528 "product_name": "Raid Volume", 00:21:47.528 "block_size": 512, 00:21:47.528 "num_blocks": 63488, 00:21:47.528 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:47.528 "assigned_rate_limits": { 00:21:47.528 "rw_ios_per_sec": 0, 00:21:47.529 "rw_mbytes_per_sec": 0, 00:21:47.529 "r_mbytes_per_sec": 0, 00:21:47.529 "w_mbytes_per_sec": 0 00:21:47.529 }, 00:21:47.529 "claimed": false, 00:21:47.529 "zoned": false, 00:21:47.529 "supported_io_types": { 00:21:47.529 "read": true, 00:21:47.529 "write": true, 00:21:47.529 "unmap": false, 00:21:47.529 "flush": false, 00:21:47.529 "reset": true, 00:21:47.529 "nvme_admin": false, 00:21:47.529 "nvme_io": false, 00:21:47.529 "nvme_io_md": false, 00:21:47.529 "write_zeroes": true, 00:21:47.529 "zcopy": false, 00:21:47.529 "get_zone_info": false, 00:21:47.529 "zone_management": false, 00:21:47.529 "zone_append": false, 00:21:47.529 "compare": false, 00:21:47.529 "compare_and_write": false, 00:21:47.529 "abort": false, 00:21:47.529 "seek_hole": false, 00:21:47.529 "seek_data": false, 00:21:47.529 "copy": false, 00:21:47.529 "nvme_iov_md": false 00:21:47.529 }, 00:21:47.529 "memory_domains": [ 00:21:47.529 { 00:21:47.529 "dma_device_id": "system", 00:21:47.529 "dma_device_type": 1 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.529 "dma_device_type": 2 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "dma_device_id": "system", 00:21:47.529 "dma_device_type": 1 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.529 "dma_device_type": 2 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "dma_device_id": "system", 00:21:47.529 "dma_device_type": 1 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.529 "dma_device_type": 2 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "dma_device_id": "system", 00:21:47.529 "dma_device_type": 1 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.529 "dma_device_type": 2 00:21:47.529 } 00:21:47.529 ], 00:21:47.529 "driver_specific": { 00:21:47.529 "raid": { 00:21:47.529 "uuid": "387a4294-6e85-4363-a145-967fcb3ff28a", 00:21:47.529 "strip_size_kb": 0, 00:21:47.529 "state": "online", 00:21:47.529 "raid_level": "raid1", 00:21:47.529 "superblock": true, 00:21:47.529 "num_base_bdevs": 4, 00:21:47.529 "num_base_bdevs_discovered": 4, 00:21:47.529 "num_base_bdevs_operational": 4, 00:21:47.529 "base_bdevs_list": [ 00:21:47.529 { 00:21:47.529 "name": "NewBaseBdev", 00:21:47.529 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:47.529 "is_configured": true, 00:21:47.529 "data_offset": 2048, 00:21:47.529 "data_size": 63488 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "name": "BaseBdev2", 00:21:47.529 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:47.529 "is_configured": true, 00:21:47.529 "data_offset": 2048, 00:21:47.529 "data_size": 63488 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "name": "BaseBdev3", 00:21:47.529 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:47.529 "is_configured": true, 00:21:47.529 "data_offset": 2048, 00:21:47.529 "data_size": 63488 00:21:47.529 }, 00:21:47.529 { 00:21:47.529 "name": "BaseBdev4", 00:21:47.529 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:47.529 "is_configured": true, 00:21:47.529 "data_offset": 2048, 00:21:47.529 "data_size": 63488 00:21:47.529 } 00:21:47.529 ] 00:21:47.529 } 00:21:47.529 } 00:21:47.529 }' 00:21:47.529 15:58:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:47.529 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:47.529 BaseBdev2 00:21:47.529 BaseBdev3 00:21:47.529 BaseBdev4' 00:21:47.529 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:47.529 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:47.529 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:47.789 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:47.789 "name": "NewBaseBdev", 00:21:47.789 "aliases": [ 00:21:47.789 "4715ba63-6fab-4d92-a270-8e06c607a338" 00:21:47.789 ], 00:21:47.789 "product_name": "Malloc disk", 00:21:47.789 "block_size": 512, 00:21:47.789 "num_blocks": 65536, 00:21:47.789 "uuid": "4715ba63-6fab-4d92-a270-8e06c607a338", 00:21:47.789 "assigned_rate_limits": { 00:21:47.789 "rw_ios_per_sec": 0, 00:21:47.789 "rw_mbytes_per_sec": 0, 00:21:47.789 "r_mbytes_per_sec": 0, 00:21:47.789 "w_mbytes_per_sec": 0 00:21:47.789 }, 00:21:47.789 "claimed": true, 00:21:47.789 "claim_type": "exclusive_write", 00:21:47.789 "zoned": false, 00:21:47.789 "supported_io_types": { 00:21:47.789 "read": true, 00:21:47.789 "write": true, 00:21:47.789 "unmap": true, 00:21:47.789 "flush": true, 00:21:47.789 "reset": true, 00:21:47.789 "nvme_admin": false, 00:21:47.789 "nvme_io": false, 00:21:47.789 "nvme_io_md": false, 00:21:47.789 "write_zeroes": true, 00:21:47.789 "zcopy": true, 00:21:47.789 "get_zone_info": false, 00:21:47.789 "zone_management": false, 00:21:47.789 "zone_append": false, 00:21:47.789 "compare": false, 00:21:47.789 "compare_and_write": false, 00:21:47.789 "abort": true, 00:21:47.789 "seek_hole": false, 00:21:47.789 "seek_data": false, 00:21:47.789 "copy": true, 00:21:47.789 "nvme_iov_md": false 00:21:47.789 }, 00:21:47.789 "memory_domains": [ 00:21:47.789 { 00:21:47.789 "dma_device_id": "system", 00:21:47.789 "dma_device_type": 1 00:21:47.789 }, 00:21:47.789 { 00:21:47.789 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:47.789 "dma_device_type": 2 00:21:47.789 } 00:21:47.789 ], 00:21:47.789 "driver_specific": {} 00:21:47.789 }' 00:21:47.789 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.789 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:47.789 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:47.789 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.789 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:47.789 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:47.789 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.050 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.050 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:48.050 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:48.050 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.050 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:48.050 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:48.050 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:48.050 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:48.311 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:48.311 "name": "BaseBdev2", 00:21:48.311 "aliases": [ 00:21:48.311 "c27cec81-329b-4058-be45-d2cde6294426" 00:21:48.311 ], 00:21:48.311 "product_name": "Malloc disk", 00:21:48.311 "block_size": 512, 00:21:48.311 "num_blocks": 65536, 00:21:48.311 "uuid": "c27cec81-329b-4058-be45-d2cde6294426", 00:21:48.311 "assigned_rate_limits": { 00:21:48.311 "rw_ios_per_sec": 0, 00:21:48.311 "rw_mbytes_per_sec": 0, 00:21:48.311 "r_mbytes_per_sec": 0, 00:21:48.311 "w_mbytes_per_sec": 0 00:21:48.312 }, 00:21:48.312 "claimed": true, 00:21:48.312 "claim_type": "exclusive_write", 00:21:48.312 "zoned": false, 00:21:48.312 "supported_io_types": { 00:21:48.312 "read": true, 00:21:48.312 "write": true, 00:21:48.312 "unmap": true, 00:21:48.312 "flush": true, 00:21:48.312 "reset": true, 00:21:48.312 "nvme_admin": false, 00:21:48.312 "nvme_io": false, 00:21:48.312 "nvme_io_md": false, 00:21:48.312 "write_zeroes": true, 00:21:48.312 "zcopy": true, 00:21:48.312 "get_zone_info": false, 00:21:48.312 "zone_management": false, 00:21:48.312 "zone_append": false, 00:21:48.312 "compare": false, 00:21:48.312 "compare_and_write": false, 00:21:48.312 "abort": true, 00:21:48.312 "seek_hole": false, 00:21:48.312 "seek_data": false, 00:21:48.312 "copy": true, 00:21:48.312 "nvme_iov_md": false 00:21:48.312 }, 00:21:48.312 "memory_domains": [ 00:21:48.312 { 00:21:48.312 "dma_device_id": "system", 00:21:48.312 "dma_device_type": 1 00:21:48.312 }, 00:21:48.312 { 00:21:48.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.312 "dma_device_type": 2 00:21:48.312 } 00:21:48.312 ], 00:21:48.312 "driver_specific": {} 00:21:48.312 }' 00:21:48.312 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:48.312 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:48.312 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:48.312 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:48.312 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:48.312 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:48.312 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.573 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:48.573 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:48.573 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.573 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:48.573 
15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:48.573 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:48.573 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:48.573 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:48.833 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:48.833 "name": "BaseBdev3", 00:21:48.833 "aliases": [ 00:21:48.833 "6c73cad1-e165-492e-a44e-893a8da13e6f" 00:21:48.833 ], 00:21:48.833 "product_name": "Malloc disk", 00:21:48.833 "block_size": 512, 00:21:48.833 "num_blocks": 65536, 00:21:48.833 "uuid": "6c73cad1-e165-492e-a44e-893a8da13e6f", 00:21:48.833 "assigned_rate_limits": { 00:21:48.833 "rw_ios_per_sec": 0, 00:21:48.833 "rw_mbytes_per_sec": 0, 00:21:48.833 "r_mbytes_per_sec": 0, 00:21:48.833 "w_mbytes_per_sec": 0 00:21:48.833 }, 00:21:48.833 "claimed": true, 00:21:48.833 "claim_type": "exclusive_write", 00:21:48.833 "zoned": false, 00:21:48.833 "supported_io_types": { 00:21:48.833 "read": true, 00:21:48.833 "write": true, 00:21:48.833 "unmap": true, 00:21:48.833 "flush": true, 00:21:48.833 "reset": true, 00:21:48.833 "nvme_admin": false, 00:21:48.833 "nvme_io": false, 00:21:48.833 "nvme_io_md": false, 00:21:48.833 "write_zeroes": true, 00:21:48.833 "zcopy": true, 00:21:48.833 "get_zone_info": false, 00:21:48.833 "zone_management": false, 00:21:48.833 "zone_append": false, 00:21:48.833 "compare": false, 00:21:48.833 "compare_and_write": false, 00:21:48.833 "abort": true, 00:21:48.833 "seek_hole": false, 00:21:48.833 "seek_data": false, 00:21:48.833 "copy": true, 00:21:48.833 "nvme_iov_md": false 00:21:48.833 }, 00:21:48.833 "memory_domains": [ 00:21:48.833 { 00:21:48.833 "dma_device_id": "system", 00:21:48.833 "dma_device_type": 1 00:21:48.833 }, 00:21:48.833 { 00:21:48.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.833 "dma_device_type": 2 00:21:48.833 } 00:21:48.833 ], 00:21:48.833 "driver_specific": {} 00:21:48.833 }' 00:21:48.833 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:48.833 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:48.833 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:48.833 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:48.833 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:49.093 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:49.093 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:49.093 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:49.093 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:49.093 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:49.093 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:49.093 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:49.093 15:58:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:49.093 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:49.093 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:49.353 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:49.353 "name": "BaseBdev4", 00:21:49.353 "aliases": [ 00:21:49.353 "37fafefe-57cf-438d-9169-e8b968964ca8" 00:21:49.353 ], 00:21:49.353 "product_name": "Malloc disk", 00:21:49.353 "block_size": 512, 00:21:49.353 "num_blocks": 65536, 00:21:49.353 "uuid": "37fafefe-57cf-438d-9169-e8b968964ca8", 00:21:49.353 "assigned_rate_limits": { 00:21:49.353 "rw_ios_per_sec": 0, 00:21:49.353 "rw_mbytes_per_sec": 0, 00:21:49.353 "r_mbytes_per_sec": 0, 00:21:49.353 "w_mbytes_per_sec": 0 00:21:49.353 }, 00:21:49.353 "claimed": true, 00:21:49.353 "claim_type": "exclusive_write", 00:21:49.353 "zoned": false, 00:21:49.353 "supported_io_types": { 00:21:49.353 "read": true, 00:21:49.353 "write": true, 00:21:49.353 "unmap": true, 00:21:49.353 "flush": true, 00:21:49.353 "reset": true, 00:21:49.353 "nvme_admin": false, 00:21:49.353 "nvme_io": false, 00:21:49.353 "nvme_io_md": false, 00:21:49.353 "write_zeroes": true, 00:21:49.353 "zcopy": true, 00:21:49.353 "get_zone_info": false, 00:21:49.353 "zone_management": false, 00:21:49.353 "zone_append": false, 00:21:49.353 "compare": false, 00:21:49.353 "compare_and_write": false, 00:21:49.353 "abort": true, 00:21:49.353 "seek_hole": false, 00:21:49.353 "seek_data": false, 00:21:49.353 "copy": true, 00:21:49.353 "nvme_iov_md": false 00:21:49.353 }, 00:21:49.353 "memory_domains": [ 00:21:49.353 { 00:21:49.353 "dma_device_id": "system", 00:21:49.353 "dma_device_type": 1 00:21:49.353 }, 00:21:49.353 { 00:21:49.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:49.353 "dma_device_type": 2 00:21:49.353 } 00:21:49.353 ], 00:21:49.353 "driver_specific": {} 00:21:49.353 }' 00:21:49.353 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:49.353 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:49.354 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:49.354 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:49.354 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:49.354 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:49.354 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:49.614 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:49.614 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:49.614 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:49.614 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:49.614 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:49.614 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:49.874 [2024-07-12 15:58:10.125382] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:49.874 [2024-07-12 15:58:10.125403] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:49.874 [2024-07-12 15:58:10.125442] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:49.874 [2024-07-12 15:58:10.125653] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:49.874 [2024-07-12 15:58:10.125660] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fefa70 name Existed_Raid, state offline 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2606430 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2606430 ']' 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 2606430 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2606430 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2606430' 00:21:49.874 killing process with pid 2606430 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 2606430 00:21:49.874 [2024-07-12 15:58:10.191127] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:49.874 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 2606430 00:21:49.874 [2024-07-12 15:58:10.211346] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:50.134 15:58:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:50.134 00:21:50.134 real 0m30.305s 00:21:50.134 user 0m56.874s 00:21:50.134 sys 0m4.352s 00:21:50.134 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:50.134 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:50.134 ************************************ 00:21:50.134 END TEST raid_state_function_test_sb 00:21:50.134 ************************************ 00:21:50.134 15:58:10 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:50.134 15:58:10 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:21:50.134 15:58:10 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:21:50.134 15:58:10 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:50.134 15:58:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:50.134 ************************************ 00:21:50.134 START TEST raid_superblock_test 00:21:50.134 ************************************ 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test 
raid1 4 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2612018 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2612018 /var/tmp/spdk-raid.sock 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 2612018 ']' 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:50.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:50.134 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:50.134 [2024-07-12 15:58:10.473741] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
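The raid_superblock_test trace below drives SPDK purely through its JSON-RPC interface: scripts/rpc.py talking to the bdev_svc application on /var/tmp/spdk-raid.sock. As a hedged illustration only, a minimal standalone sketch of the same create/verify/delete sequence, reusing just the RPCs that appear verbatim in this trace and assuming a bdev_svc instance is already listening on that socket, might look like:

    # sketch, not part of the recorded run: rebuild the raid1-with-superblock setup by hand
    RPC="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
    for i in 1 2 3 4; do
        # 32 MiB malloc bdev with 512-byte blocks (65536 blocks), wrapped in a passthru bdev with a fixed UUID
        $RPC bdev_malloc_create 32 512 -b malloc$i
        $RPC bdev_passthru_create -b malloc$i -p pt$i -u 00000000-0000-0000-0000-00000000000$i
    done
    # assemble raid1 across the four passthru bdevs; -s enables the on-disk superblock
    $RPC bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
    # the same state/property checks the test performs with jq
    $RPC bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
    $RPC bdev_get_bdevs -b raid_bdev1 | jq '.[]'
    # tear down (the recorded run does the analogous bdev_raid_delete on Existed_Raid earlier in this log)
    $RPC bdev_raid_delete raid_bdev1

The actual test additionally wraps these calls in its verify_raid_bdev_state / verify_raid_bdev_properties helpers, as shown in the trace that follows.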
00:21:50.134 [2024-07-12 15:58:10.473794] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2612018 ] 00:21:50.134 [2024-07-12 15:58:10.563080] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:50.394 [2024-07-12 15:58:10.632010] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:50.394 [2024-07-12 15:58:10.681417] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:50.394 [2024-07-12 15:58:10.681441] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:50.963 15:58:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:50.963 15:58:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:21:50.964 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:21:50.964 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:50.964 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:21:50.964 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:21:50.964 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:50.964 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:50.964 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:50.964 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:50.964 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:51.223 malloc1 00:21:51.223 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:51.482 [2024-07-12 15:58:11.684546] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:51.482 [2024-07-12 15:58:11.684578] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:51.482 [2024-07-12 15:58:11.684590] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf4eb50 00:21:51.482 [2024-07-12 15:58:11.684596] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:51.482 [2024-07-12 15:58:11.685868] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:51.482 [2024-07-12 15:58:11.685888] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:51.482 pt1 00:21:51.482 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:51.482 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:51.482 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:21:51.482 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:21:51.482 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # 
local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:51.482 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:51.482 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:51.482 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:51.483 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:51.483 malloc2 00:21:51.483 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:51.742 [2024-07-12 15:58:12.067475] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:51.742 [2024-07-12 15:58:12.067502] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:51.742 [2024-07-12 15:58:12.067512] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf4fdf0 00:21:51.742 [2024-07-12 15:58:12.067518] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:51.742 [2024-07-12 15:58:12.068696] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:51.742 [2024-07-12 15:58:12.068720] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:51.742 pt2 00:21:51.742 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:51.742 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:51.742 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:21:51.742 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:21:51.742 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:51.742 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:51.742 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:51.742 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:51.742 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:52.002 malloc3 00:21:52.003 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:52.003 [2024-07-12 15:58:12.434281] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:52.003 [2024-07-12 15:58:12.434308] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:52.003 [2024-07-12 15:58:12.434317] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf4f770 00:21:52.003 [2024-07-12 15:58:12.434328] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:52.003 [2024-07-12 15:58:12.435498] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:52.003 [2024-07-12 15:58:12.435517] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:52.003 pt3 00:21:52.003 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:52.003 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:52.003 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:21:52.003 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:21:52.003 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:52.003 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:52.262 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:21:52.262 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:52.262 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:52.262 malloc4 00:21:52.262 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:52.552 [2024-07-12 15:58:12.821259] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:52.552 [2024-07-12 15:58:12.821287] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:52.552 [2024-07-12 15:58:12.821297] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf46840 00:21:52.552 [2024-07-12 15:58:12.821303] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:52.552 [2024-07-12 15:58:12.822475] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:52.552 [2024-07-12 15:58:12.822494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:52.552 pt4 00:21:52.552 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:21:52.552 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:21:52.552 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:52.811 [2024-07-12 15:58:13.009751] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:52.811 [2024-07-12 15:58:13.010752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:52.811 [2024-07-12 15:58:13.010793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:52.811 [2024-07-12 15:58:13.010829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:52.811 [2024-07-12 15:58:13.010967] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11004c0 00:21:52.811 [2024-07-12 15:58:13.010974] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:52.811 [2024-07-12 15:58:13.011121] bdev_raid.c: 251:raid_bdev_create_cb: 
*DEBUG*: raid_bdev_create_cb, 0xf4f570 00:21:52.811 [2024-07-12 15:58:13.011239] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11004c0 00:21:52.811 [2024-07-12 15:58:13.011245] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11004c0 00:21:52.811 [2024-07-12 15:58:13.011318] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.811 "name": "raid_bdev1", 00:21:52.811 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:21:52.811 "strip_size_kb": 0, 00:21:52.811 "state": "online", 00:21:52.811 "raid_level": "raid1", 00:21:52.811 "superblock": true, 00:21:52.811 "num_base_bdevs": 4, 00:21:52.811 "num_base_bdevs_discovered": 4, 00:21:52.811 "num_base_bdevs_operational": 4, 00:21:52.811 "base_bdevs_list": [ 00:21:52.811 { 00:21:52.811 "name": "pt1", 00:21:52.811 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:52.811 "is_configured": true, 00:21:52.811 "data_offset": 2048, 00:21:52.811 "data_size": 63488 00:21:52.811 }, 00:21:52.811 { 00:21:52.811 "name": "pt2", 00:21:52.811 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:52.811 "is_configured": true, 00:21:52.811 "data_offset": 2048, 00:21:52.811 "data_size": 63488 00:21:52.811 }, 00:21:52.811 { 00:21:52.811 "name": "pt3", 00:21:52.811 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:52.811 "is_configured": true, 00:21:52.811 "data_offset": 2048, 00:21:52.811 "data_size": 63488 00:21:52.811 }, 00:21:52.811 { 00:21:52.811 "name": "pt4", 00:21:52.811 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:52.811 "is_configured": true, 00:21:52.811 "data_offset": 2048, 00:21:52.811 "data_size": 63488 00:21:52.811 } 00:21:52.811 ] 00:21:52.811 }' 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.811 15:58:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:53.379 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
verify_raid_bdev_properties raid_bdev1 00:21:53.379 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:53.379 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:53.379 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:53.379 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:53.379 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:53.379 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:53.379 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:53.639 [2024-07-12 15:58:13.948365] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:53.639 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:53.639 "name": "raid_bdev1", 00:21:53.639 "aliases": [ 00:21:53.639 "2e206601-65ec-44c0-ab48-e81251f2d4c6" 00:21:53.639 ], 00:21:53.639 "product_name": "Raid Volume", 00:21:53.639 "block_size": 512, 00:21:53.639 "num_blocks": 63488, 00:21:53.639 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:21:53.639 "assigned_rate_limits": { 00:21:53.639 "rw_ios_per_sec": 0, 00:21:53.639 "rw_mbytes_per_sec": 0, 00:21:53.639 "r_mbytes_per_sec": 0, 00:21:53.639 "w_mbytes_per_sec": 0 00:21:53.639 }, 00:21:53.639 "claimed": false, 00:21:53.639 "zoned": false, 00:21:53.639 "supported_io_types": { 00:21:53.639 "read": true, 00:21:53.639 "write": true, 00:21:53.639 "unmap": false, 00:21:53.639 "flush": false, 00:21:53.639 "reset": true, 00:21:53.639 "nvme_admin": false, 00:21:53.639 "nvme_io": false, 00:21:53.639 "nvme_io_md": false, 00:21:53.639 "write_zeroes": true, 00:21:53.639 "zcopy": false, 00:21:53.639 "get_zone_info": false, 00:21:53.639 "zone_management": false, 00:21:53.639 "zone_append": false, 00:21:53.639 "compare": false, 00:21:53.639 "compare_and_write": false, 00:21:53.639 "abort": false, 00:21:53.639 "seek_hole": false, 00:21:53.639 "seek_data": false, 00:21:53.639 "copy": false, 00:21:53.639 "nvme_iov_md": false 00:21:53.639 }, 00:21:53.639 "memory_domains": [ 00:21:53.639 { 00:21:53.639 "dma_device_id": "system", 00:21:53.639 "dma_device_type": 1 00:21:53.639 }, 00:21:53.639 { 00:21:53.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.639 "dma_device_type": 2 00:21:53.639 }, 00:21:53.639 { 00:21:53.639 "dma_device_id": "system", 00:21:53.639 "dma_device_type": 1 00:21:53.639 }, 00:21:53.639 { 00:21:53.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.639 "dma_device_type": 2 00:21:53.639 }, 00:21:53.639 { 00:21:53.639 "dma_device_id": "system", 00:21:53.639 "dma_device_type": 1 00:21:53.639 }, 00:21:53.639 { 00:21:53.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.639 "dma_device_type": 2 00:21:53.639 }, 00:21:53.639 { 00:21:53.639 "dma_device_id": "system", 00:21:53.639 "dma_device_type": 1 00:21:53.639 }, 00:21:53.639 { 00:21:53.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.639 "dma_device_type": 2 00:21:53.639 } 00:21:53.639 ], 00:21:53.639 "driver_specific": { 00:21:53.639 "raid": { 00:21:53.639 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:21:53.639 "strip_size_kb": 0, 00:21:53.639 "state": "online", 00:21:53.639 "raid_level": "raid1", 00:21:53.639 "superblock": true, 00:21:53.639 
"num_base_bdevs": 4, 00:21:53.639 "num_base_bdevs_discovered": 4, 00:21:53.639 "num_base_bdevs_operational": 4, 00:21:53.639 "base_bdevs_list": [ 00:21:53.639 { 00:21:53.639 "name": "pt1", 00:21:53.639 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:53.639 "is_configured": true, 00:21:53.639 "data_offset": 2048, 00:21:53.639 "data_size": 63488 00:21:53.639 }, 00:21:53.639 { 00:21:53.639 "name": "pt2", 00:21:53.639 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:53.639 "is_configured": true, 00:21:53.639 "data_offset": 2048, 00:21:53.639 "data_size": 63488 00:21:53.639 }, 00:21:53.639 { 00:21:53.639 "name": "pt3", 00:21:53.639 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:53.639 "is_configured": true, 00:21:53.639 "data_offset": 2048, 00:21:53.639 "data_size": 63488 00:21:53.639 }, 00:21:53.639 { 00:21:53.640 "name": "pt4", 00:21:53.640 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:53.640 "is_configured": true, 00:21:53.640 "data_offset": 2048, 00:21:53.640 "data_size": 63488 00:21:53.640 } 00:21:53.640 ] 00:21:53.640 } 00:21:53.640 } 00:21:53.640 }' 00:21:53.640 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:53.640 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:53.640 pt2 00:21:53.640 pt3 00:21:53.640 pt4' 00:21:53.640 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:53.640 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:53.640 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:53.899 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:53.899 "name": "pt1", 00:21:53.899 "aliases": [ 00:21:53.899 "00000000-0000-0000-0000-000000000001" 00:21:53.899 ], 00:21:53.899 "product_name": "passthru", 00:21:53.899 "block_size": 512, 00:21:53.899 "num_blocks": 65536, 00:21:53.899 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:53.899 "assigned_rate_limits": { 00:21:53.899 "rw_ios_per_sec": 0, 00:21:53.899 "rw_mbytes_per_sec": 0, 00:21:53.899 "r_mbytes_per_sec": 0, 00:21:53.899 "w_mbytes_per_sec": 0 00:21:53.899 }, 00:21:53.899 "claimed": true, 00:21:53.899 "claim_type": "exclusive_write", 00:21:53.899 "zoned": false, 00:21:53.899 "supported_io_types": { 00:21:53.899 "read": true, 00:21:53.899 "write": true, 00:21:53.899 "unmap": true, 00:21:53.899 "flush": true, 00:21:53.899 "reset": true, 00:21:53.899 "nvme_admin": false, 00:21:53.899 "nvme_io": false, 00:21:53.899 "nvme_io_md": false, 00:21:53.899 "write_zeroes": true, 00:21:53.899 "zcopy": true, 00:21:53.899 "get_zone_info": false, 00:21:53.899 "zone_management": false, 00:21:53.899 "zone_append": false, 00:21:53.899 "compare": false, 00:21:53.899 "compare_and_write": false, 00:21:53.899 "abort": true, 00:21:53.899 "seek_hole": false, 00:21:53.899 "seek_data": false, 00:21:53.899 "copy": true, 00:21:53.899 "nvme_iov_md": false 00:21:53.899 }, 00:21:53.899 "memory_domains": [ 00:21:53.899 { 00:21:53.899 "dma_device_id": "system", 00:21:53.899 "dma_device_type": 1 00:21:53.899 }, 00:21:53.899 { 00:21:53.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:53.899 "dma_device_type": 2 00:21:53.899 } 00:21:53.899 ], 00:21:53.899 "driver_specific": { 00:21:53.899 "passthru": { 00:21:53.899 
"name": "pt1", 00:21:53.899 "base_bdev_name": "malloc1" 00:21:53.899 } 00:21:53.899 } 00:21:53.899 }' 00:21:53.899 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.899 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:53.899 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:53.899 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:53.899 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:54.160 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.420 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.420 "name": "pt2", 00:21:54.420 "aliases": [ 00:21:54.420 "00000000-0000-0000-0000-000000000002" 00:21:54.420 ], 00:21:54.420 "product_name": "passthru", 00:21:54.420 "block_size": 512, 00:21:54.420 "num_blocks": 65536, 00:21:54.420 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:54.420 "assigned_rate_limits": { 00:21:54.420 "rw_ios_per_sec": 0, 00:21:54.420 "rw_mbytes_per_sec": 0, 00:21:54.420 "r_mbytes_per_sec": 0, 00:21:54.420 "w_mbytes_per_sec": 0 00:21:54.420 }, 00:21:54.420 "claimed": true, 00:21:54.420 "claim_type": "exclusive_write", 00:21:54.420 "zoned": false, 00:21:54.420 "supported_io_types": { 00:21:54.420 "read": true, 00:21:54.420 "write": true, 00:21:54.420 "unmap": true, 00:21:54.420 "flush": true, 00:21:54.420 "reset": true, 00:21:54.420 "nvme_admin": false, 00:21:54.420 "nvme_io": false, 00:21:54.420 "nvme_io_md": false, 00:21:54.420 "write_zeroes": true, 00:21:54.420 "zcopy": true, 00:21:54.420 "get_zone_info": false, 00:21:54.420 "zone_management": false, 00:21:54.420 "zone_append": false, 00:21:54.420 "compare": false, 00:21:54.420 "compare_and_write": false, 00:21:54.420 "abort": true, 00:21:54.420 "seek_hole": false, 00:21:54.420 "seek_data": false, 00:21:54.420 "copy": true, 00:21:54.420 "nvme_iov_md": false 00:21:54.420 }, 00:21:54.420 "memory_domains": [ 00:21:54.420 { 00:21:54.420 "dma_device_id": "system", 00:21:54.420 "dma_device_type": 1 00:21:54.420 }, 00:21:54.420 { 00:21:54.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.420 "dma_device_type": 2 00:21:54.420 } 00:21:54.420 ], 00:21:54.420 "driver_specific": { 00:21:54.420 "passthru": { 00:21:54.420 "name": "pt2", 00:21:54.420 "base_bdev_name": "malloc2" 00:21:54.420 } 00:21:54.420 } 00:21:54.420 }' 00:21:54.420 15:58:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.420 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.420 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:54.420 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.679 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:54.679 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:54.679 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.679 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:54.679 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:54.679 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.679 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:54.679 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:54.679 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:54.679 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:54.679 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:54.939 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:54.939 "name": "pt3", 00:21:54.939 "aliases": [ 00:21:54.939 "00000000-0000-0000-0000-000000000003" 00:21:54.939 ], 00:21:54.939 "product_name": "passthru", 00:21:54.939 "block_size": 512, 00:21:54.939 "num_blocks": 65536, 00:21:54.939 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:54.939 "assigned_rate_limits": { 00:21:54.939 "rw_ios_per_sec": 0, 00:21:54.939 "rw_mbytes_per_sec": 0, 00:21:54.939 "r_mbytes_per_sec": 0, 00:21:54.939 "w_mbytes_per_sec": 0 00:21:54.939 }, 00:21:54.939 "claimed": true, 00:21:54.939 "claim_type": "exclusive_write", 00:21:54.939 "zoned": false, 00:21:54.939 "supported_io_types": { 00:21:54.939 "read": true, 00:21:54.939 "write": true, 00:21:54.939 "unmap": true, 00:21:54.939 "flush": true, 00:21:54.939 "reset": true, 00:21:54.939 "nvme_admin": false, 00:21:54.939 "nvme_io": false, 00:21:54.939 "nvme_io_md": false, 00:21:54.939 "write_zeroes": true, 00:21:54.939 "zcopy": true, 00:21:54.939 "get_zone_info": false, 00:21:54.939 "zone_management": false, 00:21:54.939 "zone_append": false, 00:21:54.939 "compare": false, 00:21:54.939 "compare_and_write": false, 00:21:54.939 "abort": true, 00:21:54.939 "seek_hole": false, 00:21:54.939 "seek_data": false, 00:21:54.939 "copy": true, 00:21:54.939 "nvme_iov_md": false 00:21:54.939 }, 00:21:54.939 "memory_domains": [ 00:21:54.939 { 00:21:54.939 "dma_device_id": "system", 00:21:54.939 "dma_device_type": 1 00:21:54.939 }, 00:21:54.939 { 00:21:54.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:54.939 "dma_device_type": 2 00:21:54.939 } 00:21:54.939 ], 00:21:54.939 "driver_specific": { 00:21:54.939 "passthru": { 00:21:54.939 "name": "pt3", 00:21:54.939 "base_bdev_name": "malloc3" 00:21:54.939 } 00:21:54.939 } 00:21:54.939 }' 00:21:54.939 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.939 15:58:15 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:54.939 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:54.939 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.198 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.198 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:55.198 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.198 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.198 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:55.198 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.198 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.458 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:55.458 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:55.458 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:55.458 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:55.458 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:55.458 "name": "pt4", 00:21:55.458 "aliases": [ 00:21:55.458 "00000000-0000-0000-0000-000000000004" 00:21:55.458 ], 00:21:55.458 "product_name": "passthru", 00:21:55.458 "block_size": 512, 00:21:55.458 "num_blocks": 65536, 00:21:55.458 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:55.458 "assigned_rate_limits": { 00:21:55.458 "rw_ios_per_sec": 0, 00:21:55.458 "rw_mbytes_per_sec": 0, 00:21:55.458 "r_mbytes_per_sec": 0, 00:21:55.458 "w_mbytes_per_sec": 0 00:21:55.458 }, 00:21:55.458 "claimed": true, 00:21:55.458 "claim_type": "exclusive_write", 00:21:55.458 "zoned": false, 00:21:55.458 "supported_io_types": { 00:21:55.458 "read": true, 00:21:55.458 "write": true, 00:21:55.458 "unmap": true, 00:21:55.458 "flush": true, 00:21:55.458 "reset": true, 00:21:55.458 "nvme_admin": false, 00:21:55.458 "nvme_io": false, 00:21:55.458 "nvme_io_md": false, 00:21:55.458 "write_zeroes": true, 00:21:55.458 "zcopy": true, 00:21:55.458 "get_zone_info": false, 00:21:55.458 "zone_management": false, 00:21:55.458 "zone_append": false, 00:21:55.458 "compare": false, 00:21:55.458 "compare_and_write": false, 00:21:55.458 "abort": true, 00:21:55.458 "seek_hole": false, 00:21:55.458 "seek_data": false, 00:21:55.458 "copy": true, 00:21:55.458 "nvme_iov_md": false 00:21:55.458 }, 00:21:55.458 "memory_domains": [ 00:21:55.458 { 00:21:55.458 "dma_device_id": "system", 00:21:55.458 "dma_device_type": 1 00:21:55.458 }, 00:21:55.458 { 00:21:55.458 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:55.458 "dma_device_type": 2 00:21:55.458 } 00:21:55.458 ], 00:21:55.458 "driver_specific": { 00:21:55.458 "passthru": { 00:21:55.458 "name": "pt4", 00:21:55.458 "base_bdev_name": "malloc4" 00:21:55.458 } 00:21:55.458 } 00:21:55.458 }' 00:21:55.458 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.458 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:55.718 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 
512 ]] 00:21:55.718 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.718 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:55.718 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:55.718 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.718 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:55.718 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:55.718 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.978 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:55.978 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:55.978 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:55.978 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:21:55.978 [2024-07-12 15:58:16.390534] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:55.978 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2e206601-65ec-44c0-ab48-e81251f2d4c6 00:21:55.978 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 2e206601-65ec-44c0-ab48-e81251f2d4c6 ']' 00:21:55.978 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:56.238 [2024-07-12 15:58:16.582780] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:56.238 [2024-07-12 15:58:16.582793] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:56.238 [2024-07-12 15:58:16.582829] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:56.238 [2024-07-12 15:58:16.582888] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:56.238 [2024-07-12 15:58:16.582895] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11004c0 name raid_bdev1, state offline 00:21:56.238 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.238 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:21:56.497 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:21:56.497 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:21:56.497 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:56.497 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:56.756 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:56.756 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:21:56.756 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:56.756 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:57.016 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:21:57.016 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:57.275 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:57.275 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:57.534 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:21:57.534 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:57.534 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:57.535 [2024-07-12 15:58:17.906071] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:57.535 [2024-07-12 15:58:17.907132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:57.535 [2024-07-12 15:58:17.907165] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:57.535 [2024-07-12 15:58:17.907192] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:57.535 [2024-07-12 15:58:17.907225] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:57.535 [2024-07-12 15:58:17.907251] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:57.535 [2024-07-12 15:58:17.907265] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:57.535 [2024-07-12 15:58:17.907279] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:57.535 [2024-07-12 15:58:17.907289] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:57.535 [2024-07-12 15:58:17.907295] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10f50f0 name raid_bdev1, state configuring 00:21:57.535 request: 00:21:57.535 { 00:21:57.535 "name": "raid_bdev1", 00:21:57.535 "raid_level": "raid1", 00:21:57.535 "base_bdevs": [ 00:21:57.535 "malloc1", 00:21:57.535 "malloc2", 00:21:57.535 "malloc3", 00:21:57.535 "malloc4" 00:21:57.535 ], 00:21:57.535 "superblock": false, 00:21:57.535 "method": "bdev_raid_create", 00:21:57.535 "req_id": 1 00:21:57.535 } 00:21:57.535 Got JSON-RPC error response 00:21:57.535 response: 00:21:57.535 { 00:21:57.535 "code": -17, 00:21:57.535 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:57.535 } 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:57.535 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:21:57.794 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:21:57.794 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:21:57.794 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:58.066 [2024-07-12 15:58:18.274957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:58.066 [2024-07-12 15:58:18.274980] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.066 [2024-07-12 15:58:18.274990] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10f7cc0 00:21:58.066 [2024-07-12 15:58:18.274996] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.066 [2024-07-12 15:58:18.276220] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.066 [2024-07-12 15:58:18.276239] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:58.066 [2024-07-12 15:58:18.276282] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:21:58.066 [2024-07-12 15:58:18.276300] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:58.066 pt1 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.066 "name": "raid_bdev1", 00:21:58.066 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:21:58.066 "strip_size_kb": 0, 00:21:58.066 "state": "configuring", 00:21:58.066 "raid_level": "raid1", 00:21:58.066 "superblock": true, 00:21:58.066 "num_base_bdevs": 4, 00:21:58.066 "num_base_bdevs_discovered": 1, 00:21:58.066 "num_base_bdevs_operational": 4, 00:21:58.066 "base_bdevs_list": [ 00:21:58.066 { 00:21:58.066 "name": "pt1", 00:21:58.066 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:58.066 "is_configured": true, 00:21:58.066 "data_offset": 2048, 00:21:58.066 "data_size": 63488 00:21:58.066 }, 00:21:58.066 { 00:21:58.066 "name": null, 00:21:58.066 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:58.066 "is_configured": false, 00:21:58.066 "data_offset": 2048, 00:21:58.066 "data_size": 63488 00:21:58.066 }, 00:21:58.066 { 00:21:58.066 "name": null, 00:21:58.066 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:58.066 "is_configured": false, 00:21:58.066 "data_offset": 2048, 00:21:58.066 "data_size": 63488 00:21:58.066 }, 00:21:58.066 { 00:21:58.066 "name": null, 00:21:58.066 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:58.066 "is_configured": false, 00:21:58.066 "data_offset": 2048, 00:21:58.066 "data_size": 63488 00:21:58.066 } 00:21:58.066 ] 00:21:58.066 }' 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.066 15:58:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:58.705 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:21:58.705 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 
00000000-0000-0000-0000-000000000002 00:21:58.965 [2024-07-12 15:58:19.233392] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:58.965 [2024-07-12 15:58:19.233421] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.965 [2024-07-12 15:58:19.233432] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf478f0 00:21:58.965 [2024-07-12 15:58:19.233439] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.965 [2024-07-12 15:58:19.233693] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.965 [2024-07-12 15:58:19.233703] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:58.965 [2024-07-12 15:58:19.233753] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:58.965 [2024-07-12 15:58:19.233766] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:58.965 pt2 00:21:58.965 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:59.224 [2024-07-12 15:58:19.421879] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:59.224 "name": "raid_bdev1", 00:21:59.224 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:21:59.224 "strip_size_kb": 0, 00:21:59.224 "state": "configuring", 00:21:59.224 "raid_level": "raid1", 00:21:59.224 "superblock": true, 00:21:59.224 "num_base_bdevs": 4, 00:21:59.224 "num_base_bdevs_discovered": 1, 00:21:59.224 "num_base_bdevs_operational": 4, 00:21:59.224 "base_bdevs_list": [ 00:21:59.224 { 00:21:59.224 "name": "pt1", 00:21:59.224 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:59.224 "is_configured": true, 00:21:59.224 "data_offset": 2048, 00:21:59.224 "data_size": 63488 00:21:59.224 }, 00:21:59.224 { 00:21:59.224 "name": null, 00:21:59.224 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:21:59.224 "is_configured": false, 00:21:59.224 "data_offset": 2048, 00:21:59.224 "data_size": 63488 00:21:59.224 }, 00:21:59.224 { 00:21:59.224 "name": null, 00:21:59.224 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:59.224 "is_configured": false, 00:21:59.224 "data_offset": 2048, 00:21:59.224 "data_size": 63488 00:21:59.224 }, 00:21:59.224 { 00:21:59.224 "name": null, 00:21:59.224 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:59.224 "is_configured": false, 00:21:59.224 "data_offset": 2048, 00:21:59.224 "data_size": 63488 00:21:59.224 } 00:21:59.224 ] 00:21:59.224 }' 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:59.224 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.793 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:21:59.793 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:21:59.793 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:00.053 [2024-07-12 15:58:20.368310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:00.053 [2024-07-12 15:58:20.368344] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.053 [2024-07-12 15:58:20.368354] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf47f50 00:22:00.053 [2024-07-12 15:58:20.368361] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.053 [2024-07-12 15:58:20.368618] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:00.053 [2024-07-12 15:58:20.368629] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:00.053 [2024-07-12 15:58:20.368674] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:00.053 [2024-07-12 15:58:20.368688] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:00.053 pt2 00:22:00.053 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:00.053 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:00.053 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:00.313 [2024-07-12 15:58:20.556786] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:00.313 [2024-07-12 15:58:20.556810] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.313 [2024-07-12 15:58:20.556818] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10f22f0 00:22:00.313 [2024-07-12 15:58:20.556825] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.313 [2024-07-12 15:58:20.557056] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:00.313 [2024-07-12 15:58:20.557066] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:00.313 [2024-07-12 15:58:20.557104] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on 
bdev pt3 00:22:00.313 [2024-07-12 15:58:20.557115] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:00.313 pt3 00:22:00.313 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:00.313 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:00.313 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:00.313 [2024-07-12 15:58:20.733227] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:00.313 [2024-07-12 15:58:20.733245] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:00.313 [2024-07-12 15:58:20.733255] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10f32e0 00:22:00.313 [2024-07-12 15:58:20.733261] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:00.313 [2024-07-12 15:58:20.733470] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:00.313 [2024-07-12 15:58:20.733481] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:00.313 [2024-07-12 15:58:20.733512] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:00.313 [2024-07-12 15:58:20.733523] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:00.313 [2024-07-12 15:58:20.733615] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf456e0 00:22:00.313 [2024-07-12 15:58:20.733620] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:00.313 [2024-07-12 15:58:20.733760] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf4dda0 00:22:00.313 [2024-07-12 15:58:20.733867] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf456e0 00:22:00.313 [2024-07-12 15:58:20.733872] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf456e0 00:22:00.313 [2024-07-12 15:58:20.733943] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:00.313 pt4 00:22:00.313 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:00.313 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:00.313 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:00.313 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:00.313 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:00.314 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.314 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.314 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.314 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.314 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.314 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.314 
15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.314 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.314 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.573 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.573 "name": "raid_bdev1", 00:22:00.573 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:22:00.573 "strip_size_kb": 0, 00:22:00.573 "state": "online", 00:22:00.573 "raid_level": "raid1", 00:22:00.573 "superblock": true, 00:22:00.573 "num_base_bdevs": 4, 00:22:00.573 "num_base_bdevs_discovered": 4, 00:22:00.573 "num_base_bdevs_operational": 4, 00:22:00.573 "base_bdevs_list": [ 00:22:00.573 { 00:22:00.573 "name": "pt1", 00:22:00.573 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:00.573 "is_configured": true, 00:22:00.573 "data_offset": 2048, 00:22:00.573 "data_size": 63488 00:22:00.573 }, 00:22:00.573 { 00:22:00.573 "name": "pt2", 00:22:00.573 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:00.573 "is_configured": true, 00:22:00.573 "data_offset": 2048, 00:22:00.573 "data_size": 63488 00:22:00.573 }, 00:22:00.573 { 00:22:00.573 "name": "pt3", 00:22:00.573 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:00.573 "is_configured": true, 00:22:00.573 "data_offset": 2048, 00:22:00.573 "data_size": 63488 00:22:00.573 }, 00:22:00.573 { 00:22:00.573 "name": "pt4", 00:22:00.573 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:00.573 "is_configured": true, 00:22:00.573 "data_offset": 2048, 00:22:00.573 "data_size": 63488 00:22:00.573 } 00:22:00.573 ] 00:22:00.573 }' 00:22:00.573 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.573 15:58:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.141 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:01.141 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:01.141 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:01.141 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:01.141 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:01.141 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:01.141 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:01.141 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:01.401 [2024-07-12 15:58:21.748055] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:01.401 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:01.401 "name": "raid_bdev1", 00:22:01.401 "aliases": [ 00:22:01.401 "2e206601-65ec-44c0-ab48-e81251f2d4c6" 00:22:01.401 ], 00:22:01.401 "product_name": "Raid Volume", 00:22:01.401 "block_size": 512, 00:22:01.401 "num_blocks": 63488, 00:22:01.401 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:22:01.401 "assigned_rate_limits": { 00:22:01.401 "rw_ios_per_sec": 0, 00:22:01.401 
"rw_mbytes_per_sec": 0, 00:22:01.401 "r_mbytes_per_sec": 0, 00:22:01.401 "w_mbytes_per_sec": 0 00:22:01.401 }, 00:22:01.401 "claimed": false, 00:22:01.401 "zoned": false, 00:22:01.401 "supported_io_types": { 00:22:01.401 "read": true, 00:22:01.401 "write": true, 00:22:01.401 "unmap": false, 00:22:01.401 "flush": false, 00:22:01.401 "reset": true, 00:22:01.401 "nvme_admin": false, 00:22:01.401 "nvme_io": false, 00:22:01.401 "nvme_io_md": false, 00:22:01.401 "write_zeroes": true, 00:22:01.401 "zcopy": false, 00:22:01.401 "get_zone_info": false, 00:22:01.401 "zone_management": false, 00:22:01.401 "zone_append": false, 00:22:01.401 "compare": false, 00:22:01.401 "compare_and_write": false, 00:22:01.401 "abort": false, 00:22:01.401 "seek_hole": false, 00:22:01.401 "seek_data": false, 00:22:01.401 "copy": false, 00:22:01.401 "nvme_iov_md": false 00:22:01.401 }, 00:22:01.401 "memory_domains": [ 00:22:01.401 { 00:22:01.401 "dma_device_id": "system", 00:22:01.401 "dma_device_type": 1 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.401 "dma_device_type": 2 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "dma_device_id": "system", 00:22:01.401 "dma_device_type": 1 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.401 "dma_device_type": 2 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "dma_device_id": "system", 00:22:01.401 "dma_device_type": 1 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.401 "dma_device_type": 2 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "dma_device_id": "system", 00:22:01.401 "dma_device_type": 1 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.401 "dma_device_type": 2 00:22:01.401 } 00:22:01.401 ], 00:22:01.401 "driver_specific": { 00:22:01.401 "raid": { 00:22:01.401 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:22:01.401 "strip_size_kb": 0, 00:22:01.401 "state": "online", 00:22:01.401 "raid_level": "raid1", 00:22:01.401 "superblock": true, 00:22:01.401 "num_base_bdevs": 4, 00:22:01.401 "num_base_bdevs_discovered": 4, 00:22:01.401 "num_base_bdevs_operational": 4, 00:22:01.401 "base_bdevs_list": [ 00:22:01.401 { 00:22:01.401 "name": "pt1", 00:22:01.401 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:01.401 "is_configured": true, 00:22:01.401 "data_offset": 2048, 00:22:01.401 "data_size": 63488 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "name": "pt2", 00:22:01.401 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:01.401 "is_configured": true, 00:22:01.401 "data_offset": 2048, 00:22:01.401 "data_size": 63488 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "name": "pt3", 00:22:01.401 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:01.401 "is_configured": true, 00:22:01.401 "data_offset": 2048, 00:22:01.401 "data_size": 63488 00:22:01.401 }, 00:22:01.401 { 00:22:01.401 "name": "pt4", 00:22:01.401 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:01.401 "is_configured": true, 00:22:01.401 "data_offset": 2048, 00:22:01.401 "data_size": 63488 00:22:01.401 } 00:22:01.401 ] 00:22:01.401 } 00:22:01.401 } 00:22:01.401 }' 00:22:01.401 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:01.401 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:01.401 pt2 00:22:01.401 pt3 00:22:01.401 pt4' 00:22:01.401 15:58:21 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:01.401 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:01.401 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:01.661 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:01.661 "name": "pt1", 00:22:01.661 "aliases": [ 00:22:01.661 "00000000-0000-0000-0000-000000000001" 00:22:01.661 ], 00:22:01.661 "product_name": "passthru", 00:22:01.661 "block_size": 512, 00:22:01.661 "num_blocks": 65536, 00:22:01.661 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:01.661 "assigned_rate_limits": { 00:22:01.661 "rw_ios_per_sec": 0, 00:22:01.661 "rw_mbytes_per_sec": 0, 00:22:01.661 "r_mbytes_per_sec": 0, 00:22:01.661 "w_mbytes_per_sec": 0 00:22:01.661 }, 00:22:01.661 "claimed": true, 00:22:01.661 "claim_type": "exclusive_write", 00:22:01.661 "zoned": false, 00:22:01.661 "supported_io_types": { 00:22:01.661 "read": true, 00:22:01.661 "write": true, 00:22:01.661 "unmap": true, 00:22:01.661 "flush": true, 00:22:01.661 "reset": true, 00:22:01.661 "nvme_admin": false, 00:22:01.661 "nvme_io": false, 00:22:01.661 "nvme_io_md": false, 00:22:01.661 "write_zeroes": true, 00:22:01.661 "zcopy": true, 00:22:01.661 "get_zone_info": false, 00:22:01.661 "zone_management": false, 00:22:01.661 "zone_append": false, 00:22:01.661 "compare": false, 00:22:01.661 "compare_and_write": false, 00:22:01.661 "abort": true, 00:22:01.661 "seek_hole": false, 00:22:01.661 "seek_data": false, 00:22:01.661 "copy": true, 00:22:01.661 "nvme_iov_md": false 00:22:01.661 }, 00:22:01.661 "memory_domains": [ 00:22:01.661 { 00:22:01.661 "dma_device_id": "system", 00:22:01.661 "dma_device_type": 1 00:22:01.661 }, 00:22:01.661 { 00:22:01.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:01.661 "dma_device_type": 2 00:22:01.661 } 00:22:01.661 ], 00:22:01.661 "driver_specific": { 00:22:01.661 "passthru": { 00:22:01.661 "name": "pt1", 00:22:01.661 "base_bdev_name": "malloc1" 00:22:01.661 } 00:22:01.661 } 00:22:01.661 }' 00:22:01.661 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:01.661 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:01.661 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:01.661 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:01.921 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:02.180 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:02.180 "name": "pt2", 00:22:02.181 "aliases": [ 00:22:02.181 "00000000-0000-0000-0000-000000000002" 00:22:02.181 ], 00:22:02.181 "product_name": "passthru", 00:22:02.181 "block_size": 512, 00:22:02.181 "num_blocks": 65536, 00:22:02.181 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:02.181 "assigned_rate_limits": { 00:22:02.181 "rw_ios_per_sec": 0, 00:22:02.181 "rw_mbytes_per_sec": 0, 00:22:02.181 "r_mbytes_per_sec": 0, 00:22:02.181 "w_mbytes_per_sec": 0 00:22:02.181 }, 00:22:02.181 "claimed": true, 00:22:02.181 "claim_type": "exclusive_write", 00:22:02.181 "zoned": false, 00:22:02.181 "supported_io_types": { 00:22:02.181 "read": true, 00:22:02.181 "write": true, 00:22:02.181 "unmap": true, 00:22:02.181 "flush": true, 00:22:02.181 "reset": true, 00:22:02.181 "nvme_admin": false, 00:22:02.181 "nvme_io": false, 00:22:02.181 "nvme_io_md": false, 00:22:02.181 "write_zeroes": true, 00:22:02.181 "zcopy": true, 00:22:02.181 "get_zone_info": false, 00:22:02.181 "zone_management": false, 00:22:02.181 "zone_append": false, 00:22:02.181 "compare": false, 00:22:02.181 "compare_and_write": false, 00:22:02.181 "abort": true, 00:22:02.181 "seek_hole": false, 00:22:02.181 "seek_data": false, 00:22:02.181 "copy": true, 00:22:02.181 "nvme_iov_md": false 00:22:02.181 }, 00:22:02.181 "memory_domains": [ 00:22:02.181 { 00:22:02.181 "dma_device_id": "system", 00:22:02.181 "dma_device_type": 1 00:22:02.181 }, 00:22:02.181 { 00:22:02.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.181 "dma_device_type": 2 00:22:02.181 } 00:22:02.181 ], 00:22:02.181 "driver_specific": { 00:22:02.181 "passthru": { 00:22:02.181 "name": "pt2", 00:22:02.181 "base_bdev_name": "malloc2" 00:22:02.181 } 00:22:02.181 } 00:22:02.181 }' 00:22:02.181 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.181 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.181 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:02.181 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
pt3 00:22:02.440 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:02.700 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:02.700 "name": "pt3", 00:22:02.700 "aliases": [ 00:22:02.700 "00000000-0000-0000-0000-000000000003" 00:22:02.700 ], 00:22:02.700 "product_name": "passthru", 00:22:02.700 "block_size": 512, 00:22:02.700 "num_blocks": 65536, 00:22:02.700 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:02.700 "assigned_rate_limits": { 00:22:02.700 "rw_ios_per_sec": 0, 00:22:02.700 "rw_mbytes_per_sec": 0, 00:22:02.700 "r_mbytes_per_sec": 0, 00:22:02.700 "w_mbytes_per_sec": 0 00:22:02.700 }, 00:22:02.700 "claimed": true, 00:22:02.700 "claim_type": "exclusive_write", 00:22:02.700 "zoned": false, 00:22:02.700 "supported_io_types": { 00:22:02.700 "read": true, 00:22:02.700 "write": true, 00:22:02.700 "unmap": true, 00:22:02.700 "flush": true, 00:22:02.700 "reset": true, 00:22:02.700 "nvme_admin": false, 00:22:02.700 "nvme_io": false, 00:22:02.700 "nvme_io_md": false, 00:22:02.700 "write_zeroes": true, 00:22:02.700 "zcopy": true, 00:22:02.700 "get_zone_info": false, 00:22:02.700 "zone_management": false, 00:22:02.700 "zone_append": false, 00:22:02.700 "compare": false, 00:22:02.700 "compare_and_write": false, 00:22:02.700 "abort": true, 00:22:02.700 "seek_hole": false, 00:22:02.700 "seek_data": false, 00:22:02.700 "copy": true, 00:22:02.700 "nvme_iov_md": false 00:22:02.700 }, 00:22:02.700 "memory_domains": [ 00:22:02.700 { 00:22:02.700 "dma_device_id": "system", 00:22:02.700 "dma_device_type": 1 00:22:02.700 }, 00:22:02.700 { 00:22:02.700 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:02.700 "dma_device_type": 2 00:22:02.700 } 00:22:02.700 ], 00:22:02.700 "driver_specific": { 00:22:02.700 "passthru": { 00:22:02.700 "name": "pt3", 00:22:02.700 "base_bdev_name": "malloc3" 00:22:02.700 } 00:22:02.700 } 00:22:02.700 }' 00:22:02.700 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.700 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:02.700 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:02.700 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:02.960 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:03.529 15:58:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:03.529 "name": "pt4", 00:22:03.529 "aliases": [ 00:22:03.529 "00000000-0000-0000-0000-000000000004" 00:22:03.529 ], 00:22:03.529 "product_name": "passthru", 00:22:03.529 "block_size": 512, 00:22:03.529 "num_blocks": 65536, 00:22:03.529 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:03.529 "assigned_rate_limits": { 00:22:03.529 "rw_ios_per_sec": 0, 00:22:03.529 "rw_mbytes_per_sec": 0, 00:22:03.529 "r_mbytes_per_sec": 0, 00:22:03.529 "w_mbytes_per_sec": 0 00:22:03.529 }, 00:22:03.529 "claimed": true, 00:22:03.529 "claim_type": "exclusive_write", 00:22:03.529 "zoned": false, 00:22:03.529 "supported_io_types": { 00:22:03.529 "read": true, 00:22:03.529 "write": true, 00:22:03.529 "unmap": true, 00:22:03.529 "flush": true, 00:22:03.529 "reset": true, 00:22:03.529 "nvme_admin": false, 00:22:03.529 "nvme_io": false, 00:22:03.529 "nvme_io_md": false, 00:22:03.529 "write_zeroes": true, 00:22:03.529 "zcopy": true, 00:22:03.529 "get_zone_info": false, 00:22:03.529 "zone_management": false, 00:22:03.529 "zone_append": false, 00:22:03.529 "compare": false, 00:22:03.529 "compare_and_write": false, 00:22:03.529 "abort": true, 00:22:03.529 "seek_hole": false, 00:22:03.529 "seek_data": false, 00:22:03.529 "copy": true, 00:22:03.529 "nvme_iov_md": false 00:22:03.529 }, 00:22:03.529 "memory_domains": [ 00:22:03.529 { 00:22:03.529 "dma_device_id": "system", 00:22:03.529 "dma_device_type": 1 00:22:03.529 }, 00:22:03.529 { 00:22:03.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:03.529 "dma_device_type": 2 00:22:03.529 } 00:22:03.529 ], 00:22:03.529 "driver_specific": { 00:22:03.529 "passthru": { 00:22:03.529 "name": "pt4", 00:22:03.529 "base_bdev_name": "malloc4" 00:22:03.529 } 00:22:03.529 } 00:22:03.529 }' 00:22:03.529 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.788 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:03.788 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:03.788 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.788 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:03.788 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:03.788 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.788 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:03.788 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:03.788 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.048 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:04.048 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:04.049 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:04.049 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:04.049 [2024-07-12 15:58:24.462916] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:04.049 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 2e206601-65ec-44c0-ab48-e81251f2d4c6 '!=' 
2e206601-65ec-44c0-ab48-e81251f2d4c6 ']' 00:22:04.049 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:22:04.049 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:04.049 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:04.049 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:04.308 [2024-07-12 15:58:24.659184] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:22:04.308 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:04.308 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:04.308 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:04.308 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.308 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.309 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:04.309 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.309 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.309 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.309 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.309 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.309 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.568 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.568 "name": "raid_bdev1", 00:22:04.568 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:22:04.568 "strip_size_kb": 0, 00:22:04.568 "state": "online", 00:22:04.568 "raid_level": "raid1", 00:22:04.568 "superblock": true, 00:22:04.568 "num_base_bdevs": 4, 00:22:04.568 "num_base_bdevs_discovered": 3, 00:22:04.568 "num_base_bdevs_operational": 3, 00:22:04.568 "base_bdevs_list": [ 00:22:04.568 { 00:22:04.568 "name": null, 00:22:04.568 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:04.568 "is_configured": false, 00:22:04.568 "data_offset": 2048, 00:22:04.568 "data_size": 63488 00:22:04.568 }, 00:22:04.568 { 00:22:04.568 "name": "pt2", 00:22:04.568 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:04.568 "is_configured": true, 00:22:04.568 "data_offset": 2048, 00:22:04.568 "data_size": 63488 00:22:04.568 }, 00:22:04.568 { 00:22:04.568 "name": "pt3", 00:22:04.568 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:04.568 "is_configured": true, 00:22:04.568 "data_offset": 2048, 00:22:04.568 "data_size": 63488 00:22:04.568 }, 00:22:04.568 { 00:22:04.568 "name": "pt4", 00:22:04.568 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:04.568 "is_configured": true, 00:22:04.568 "data_offset": 2048, 00:22:04.568 "data_size": 63488 00:22:04.568 } 00:22:04.568 ] 00:22:04.568 }' 00:22:04.568 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:22:04.568 15:58:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.158 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:05.158 [2024-07-12 15:58:25.597538] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:05.158 [2024-07-12 15:58:25.597555] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:05.158 [2024-07-12 15:58:25.597589] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:05.158 [2024-07-12 15:58:25.597640] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:05.158 [2024-07-12 15:58:25.597646] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf456e0 name raid_bdev1, state offline 00:22:05.416 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.416 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:22:05.416 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:22:05.416 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:22:05.416 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:22:05.416 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:05.416 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:05.675 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:05.675 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:05.675 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:05.934 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:05.934 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:05.934 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:05.934 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:22:05.934 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:22:05.934 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:22:05.934 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:05.934 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:06.193 [2024-07-12 15:58:26.547912] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:06.193 [2024-07-12 15:58:26.547940] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:22:06.193 [2024-07-12 15:58:26.547951] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf46f70 00:22:06.193 [2024-07-12 15:58:26.547957] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:06.193 [2024-07-12 15:58:26.549218] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:06.193 [2024-07-12 15:58:26.549238] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:06.193 [2024-07-12 15:58:26.549289] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:06.193 [2024-07-12 15:58:26.549308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:06.193 pt2 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.193 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.452 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.452 "name": "raid_bdev1", 00:22:06.452 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:22:06.452 "strip_size_kb": 0, 00:22:06.452 "state": "configuring", 00:22:06.452 "raid_level": "raid1", 00:22:06.452 "superblock": true, 00:22:06.452 "num_base_bdevs": 4, 00:22:06.452 "num_base_bdevs_discovered": 1, 00:22:06.452 "num_base_bdevs_operational": 3, 00:22:06.452 "base_bdevs_list": [ 00:22:06.452 { 00:22:06.452 "name": null, 00:22:06.452 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.452 "is_configured": false, 00:22:06.452 "data_offset": 2048, 00:22:06.452 "data_size": 63488 00:22:06.452 }, 00:22:06.452 { 00:22:06.452 "name": "pt2", 00:22:06.452 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:06.452 "is_configured": true, 00:22:06.452 "data_offset": 2048, 00:22:06.452 "data_size": 63488 00:22:06.452 }, 00:22:06.452 { 00:22:06.452 "name": null, 00:22:06.452 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:06.452 "is_configured": false, 00:22:06.452 "data_offset": 2048, 00:22:06.452 "data_size": 63488 00:22:06.452 }, 00:22:06.452 { 00:22:06.452 "name": null, 00:22:06.452 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:06.452 "is_configured": false, 00:22:06.452 "data_offset": 2048, 00:22:06.452 "data_size": 63488 
00:22:06.452 } 00:22:06.452 ] 00:22:06.452 }' 00:22:06.452 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.452 15:58:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:07.021 [2024-07-12 15:58:27.446175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:07.021 [2024-07-12 15:58:27.446202] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:07.021 [2024-07-12 15:58:27.446211] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10f7520 00:22:07.021 [2024-07-12 15:58:27.446217] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:07.021 [2024-07-12 15:58:27.446468] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:07.021 [2024-07-12 15:58:27.446479] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:07.021 [2024-07-12 15:58:27.446520] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:07.021 [2024-07-12 15:58:27.446532] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:07.021 pt3 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.021 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:07.281 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.281 "name": "raid_bdev1", 00:22:07.281 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:22:07.281 "strip_size_kb": 0, 00:22:07.281 "state": "configuring", 00:22:07.281 "raid_level": "raid1", 00:22:07.281 "superblock": true, 00:22:07.281 "num_base_bdevs": 4, 00:22:07.281 "num_base_bdevs_discovered": 2, 00:22:07.281 
"num_base_bdevs_operational": 3, 00:22:07.281 "base_bdevs_list": [ 00:22:07.281 { 00:22:07.281 "name": null, 00:22:07.281 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:07.281 "is_configured": false, 00:22:07.281 "data_offset": 2048, 00:22:07.281 "data_size": 63488 00:22:07.281 }, 00:22:07.281 { 00:22:07.281 "name": "pt2", 00:22:07.281 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:07.281 "is_configured": true, 00:22:07.281 "data_offset": 2048, 00:22:07.281 "data_size": 63488 00:22:07.281 }, 00:22:07.281 { 00:22:07.281 "name": "pt3", 00:22:07.281 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:07.281 "is_configured": true, 00:22:07.281 "data_offset": 2048, 00:22:07.281 "data_size": 63488 00:22:07.281 }, 00:22:07.281 { 00:22:07.281 "name": null, 00:22:07.281 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:07.281 "is_configured": false, 00:22:07.281 "data_offset": 2048, 00:22:07.281 "data_size": 63488 00:22:07.281 } 00:22:07.281 ] 00:22:07.281 }' 00:22:07.281 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.281 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.849 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:22:07.849 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:22:07.849 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:22:07.849 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:08.109 [2024-07-12 15:58:28.376643] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:08.109 [2024-07-12 15:58:28.376671] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:08.109 [2024-07-12 15:58:28.376681] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10f22f0 00:22:08.109 [2024-07-12 15:58:28.376687] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:08.109 [2024-07-12 15:58:28.376940] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:08.109 [2024-07-12 15:58:28.376951] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:08.109 [2024-07-12 15:58:28.376992] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:08.109 [2024-07-12 15:58:28.377005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:08.109 [2024-07-12 15:58:28.377089] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf4dc30 00:22:08.109 [2024-07-12 15:58:28.377095] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:08.109 [2024-07-12 15:58:28.377239] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf4d590 00:22:08.109 [2024-07-12 15:58:28.377342] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf4dc30 00:22:08.109 [2024-07-12 15:58:28.377347] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf4dc30 00:22:08.109 [2024-07-12 15:58:28.377417] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:08.109 pt4 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.109 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:08.368 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:08.368 "name": "raid_bdev1", 00:22:08.368 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:22:08.368 "strip_size_kb": 0, 00:22:08.368 "state": "online", 00:22:08.368 "raid_level": "raid1", 00:22:08.368 "superblock": true, 00:22:08.368 "num_base_bdevs": 4, 00:22:08.368 "num_base_bdevs_discovered": 3, 00:22:08.368 "num_base_bdevs_operational": 3, 00:22:08.368 "base_bdevs_list": [ 00:22:08.368 { 00:22:08.368 "name": null, 00:22:08.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:08.368 "is_configured": false, 00:22:08.368 "data_offset": 2048, 00:22:08.368 "data_size": 63488 00:22:08.368 }, 00:22:08.368 { 00:22:08.368 "name": "pt2", 00:22:08.368 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:08.368 "is_configured": true, 00:22:08.368 "data_offset": 2048, 00:22:08.368 "data_size": 63488 00:22:08.368 }, 00:22:08.368 { 00:22:08.368 "name": "pt3", 00:22:08.368 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:08.368 "is_configured": true, 00:22:08.368 "data_offset": 2048, 00:22:08.368 "data_size": 63488 00:22:08.368 }, 00:22:08.368 { 00:22:08.368 "name": "pt4", 00:22:08.368 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:08.368 "is_configured": true, 00:22:08.368 "data_offset": 2048, 00:22:08.368 "data_size": 63488 00:22:08.368 } 00:22:08.368 ] 00:22:08.368 }' 00:22:08.368 15:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:08.368 15:58:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:08.936 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:08.936 [2024-07-12 15:58:29.315015] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:08.936 [2024-07-12 15:58:29.315030] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:08.936 [2024-07-12 15:58:29.315066] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:08.936 [2024-07-12 15:58:29.315115] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:08.936 [2024-07-12 15:58:29.315120] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf4dc30 name raid_bdev1, state offline 00:22:08.936 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.936 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:22:09.194 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:22:09.194 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:22:09.194 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:22:09.194 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:22:09.194 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:09.453 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:09.453 [2024-07-12 15:58:29.884424] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:09.453 [2024-07-12 15:58:29.884451] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.453 [2024-07-12 15:58:29.884460] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf4ee50 00:22:09.453 [2024-07-12 15:58:29.884466] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.453 [2024-07-12 15:58:29.885723] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.453 [2024-07-12 15:58:29.885743] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:09.453 [2024-07-12 15:58:29.885786] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:09.453 [2024-07-12 15:58:29.885806] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:09.453 [2024-07-12 15:58:29.885876] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:22:09.453 [2024-07-12 15:58:29.885883] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:09.453 [2024-07-12 15:58:29.885892] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf45b00 name raid_bdev1, state configuring 00:22:09.453 [2024-07-12 15:58:29.885906] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:09.453 [2024-07-12 15:58:29.885962] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:09.453 pt1 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:09.712 15:58:29 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.712 15:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.712 15:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:09.712 "name": "raid_bdev1", 00:22:09.712 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:22:09.712 "strip_size_kb": 0, 00:22:09.712 "state": "configuring", 00:22:09.712 "raid_level": "raid1", 00:22:09.712 "superblock": true, 00:22:09.712 "num_base_bdevs": 4, 00:22:09.712 "num_base_bdevs_discovered": 2, 00:22:09.712 "num_base_bdevs_operational": 3, 00:22:09.712 "base_bdevs_list": [ 00:22:09.712 { 00:22:09.712 "name": null, 00:22:09.712 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:09.712 "is_configured": false, 00:22:09.712 "data_offset": 2048, 00:22:09.712 "data_size": 63488 00:22:09.712 }, 00:22:09.712 { 00:22:09.712 "name": "pt2", 00:22:09.712 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:09.712 "is_configured": true, 00:22:09.712 "data_offset": 2048, 00:22:09.712 "data_size": 63488 00:22:09.712 }, 00:22:09.712 { 00:22:09.712 "name": "pt3", 00:22:09.712 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:09.712 "is_configured": true, 00:22:09.712 "data_offset": 2048, 00:22:09.712 "data_size": 63488 00:22:09.712 }, 00:22:09.712 { 00:22:09.712 "name": null, 00:22:09.712 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:09.712 "is_configured": false, 00:22:09.712 "data_offset": 2048, 00:22:09.712 "data_size": 63488 00:22:09.712 } 00:22:09.712 ] 00:22:09.712 }' 00:22:09.712 15:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:09.712 15:58:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.281 15:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:22:10.281 15:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:10.540 15:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:22:10.540 15:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:10.799 [2024-07-12 15:58:31.055388] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:10.799 [2024-07-12 15:58:31.055422] vbdev_passthru.c: 635:vbdev_passthru_register: 
*NOTICE*: base bdev opened 00:22:10.799 [2024-07-12 15:58:31.055434] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf47b20 00:22:10.799 [2024-07-12 15:58:31.055440] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:10.799 [2024-07-12 15:58:31.055707] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:10.799 [2024-07-12 15:58:31.055726] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:10.799 [2024-07-12 15:58:31.055768] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:10.799 [2024-07-12 15:58:31.055781] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:10.799 [2024-07-12 15:58:31.055870] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf471a0 00:22:10.799 [2024-07-12 15:58:31.055877] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:10.799 [2024-07-12 15:58:31.056012] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf4f0e0 00:22:10.799 [2024-07-12 15:58:31.056112] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf471a0 00:22:10.799 [2024-07-12 15:58:31.056118] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf471a0 00:22:10.799 [2024-07-12 15:58:31.056187] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:10.799 pt4 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.799 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.059 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:11.059 "name": "raid_bdev1", 00:22:11.059 "uuid": "2e206601-65ec-44c0-ab48-e81251f2d4c6", 00:22:11.059 "strip_size_kb": 0, 00:22:11.059 "state": "online", 00:22:11.059 "raid_level": "raid1", 00:22:11.059 "superblock": true, 00:22:11.059 "num_base_bdevs": 4, 00:22:11.059 "num_base_bdevs_discovered": 3, 00:22:11.059 "num_base_bdevs_operational": 3, 00:22:11.059 "base_bdevs_list": [ 00:22:11.059 { 00:22:11.059 "name": null, 00:22:11.059 "uuid": "00000000-0000-0000-0000-000000000000", 
00:22:11.059 "is_configured": false, 00:22:11.059 "data_offset": 2048, 00:22:11.059 "data_size": 63488 00:22:11.059 }, 00:22:11.059 { 00:22:11.059 "name": "pt2", 00:22:11.059 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:11.059 "is_configured": true, 00:22:11.059 "data_offset": 2048, 00:22:11.059 "data_size": 63488 00:22:11.059 }, 00:22:11.059 { 00:22:11.059 "name": "pt3", 00:22:11.059 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:11.059 "is_configured": true, 00:22:11.059 "data_offset": 2048, 00:22:11.059 "data_size": 63488 00:22:11.059 }, 00:22:11.059 { 00:22:11.059 "name": "pt4", 00:22:11.059 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:11.059 "is_configured": true, 00:22:11.059 "data_offset": 2048, 00:22:11.059 "data_size": 63488 00:22:11.059 } 00:22:11.059 ] 00:22:11.059 }' 00:22:11.059 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:11.059 15:58:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:11.627 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:22:11.627 15:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:22:11.627 15:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:22:11.627 15:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:11.627 15:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:22:11.887 [2024-07-12 15:58:32.194513] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 2e206601-65ec-44c0-ab48-e81251f2d4c6 '!=' 2e206601-65ec-44c0-ab48-e81251f2d4c6 ']' 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2612018 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 2612018 ']' 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 2612018 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2612018 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2612018' 00:22:11.887 killing process with pid 2612018 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 2612018 00:22:11.887 [2024-07-12 15:58:32.249852] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:11.887 [2024-07-12 15:58:32.249890] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:11.887 [2024-07-12 15:58:32.249938] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:22:11.887 [2024-07-12 15:58:32.249944] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf471a0 name raid_bdev1, state offline 00:22:11.887 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 2612018 00:22:11.887 [2024-07-12 15:58:32.270532] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:12.147 15:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:12.147 00:22:12.147 real 0m21.971s 00:22:12.147 user 0m41.148s 00:22:12.147 sys 0m3.246s 00:22:12.147 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:12.147 15:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:12.147 ************************************ 00:22:12.147 END TEST raid_superblock_test 00:22:12.147 ************************************ 00:22:12.147 15:58:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:12.147 15:58:32 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:22:12.147 15:58:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:12.147 15:58:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:12.147 15:58:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:12.147 ************************************ 00:22:12.147 START TEST raid_read_error_test 00:22:12.147 ************************************ 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:12.147 15:58:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.FFodchzXkx 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2616200 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2616200 /var/tmp/spdk-raid.sock 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 2616200 ']' 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:12.147 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:12.147 15:58:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:12.147 [2024-07-12 15:58:32.543424] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
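Here bdevperf is launched in wait mode (-z), so no I/O is issued until the test has finished assembling raid_bdev1 over RPC, and -L bdev_raid is what produces the *DEBUG* raid lines in this log. A rough sketch of that launch-and-trigger pattern, with the redirection into the mktemp'd bdevperf log assumed from the later grep (the xtrace itself does not show redirections):

    # start bdevperf idle on a private RPC socket, targeting raid_bdev1, with raid debug logging
    ./build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 \
        -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid > "$bdevperf_log" 2>&1 &
    raid_pid=$!
    waitforlisten $raid_pid /var/tmp/spdk-raid.sock
    # ... build the BaseBdev1-4 malloc/error/passthru stacks and raid_bdev1 over RPC ...
    # then release the queued workload
    ./examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests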
00:22:12.147 [2024-07-12 15:58:32.543482] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2616200 ] 00:22:12.407 [2024-07-12 15:58:32.634898] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:12.407 [2024-07-12 15:58:32.702802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:12.407 [2024-07-12 15:58:32.742784] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:12.407 [2024-07-12 15:58:32.742806] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:12.979 15:58:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:12.979 15:58:33 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:12.979 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:12.979 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:13.238 BaseBdev1_malloc 00:22:13.238 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:13.497 true 00:22:13.497 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:13.497 [2024-07-12 15:58:33.885147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:13.497 [2024-07-12 15:58:33.885176] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:13.497 [2024-07-12 15:58:33.885186] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1dacaa0 00:22:13.497 [2024-07-12 15:58:33.885193] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:13.497 [2024-07-12 15:58:33.886419] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:13.497 [2024-07-12 15:58:33.886438] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:13.497 BaseBdev1 00:22:13.497 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:13.497 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:13.756 BaseBdev2_malloc 00:22:13.756 15:58:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:14.015 true 00:22:14.015 15:58:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:14.015 [2024-07-12 15:58:34.428364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:14.015 [2024-07-12 15:58:34.428391] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:14.015 [2024-07-12 15:58:34.428403] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db1e40 00:22:14.015 [2024-07-12 15:58:34.428409] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:14.015 [2024-07-12 15:58:34.429599] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:14.015 [2024-07-12 15:58:34.429618] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:14.015 BaseBdev2 00:22:14.015 15:58:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:14.015 15:58:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:14.274 BaseBdev3_malloc 00:22:14.274 15:58:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:14.533 true 00:22:14.533 15:58:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:14.792 [2024-07-12 15:58:34.995788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:14.792 [2024-07-12 15:58:34.995815] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:14.792 [2024-07-12 15:58:34.995827] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db37f0 00:22:14.792 [2024-07-12 15:58:34.995834] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:14.792 [2024-07-12 15:58:34.997026] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:14.792 [2024-07-12 15:58:34.997045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:14.792 BaseBdev3 00:22:14.792 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:14.792 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:14.792 BaseBdev4_malloc 00:22:14.792 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:15.073 true 00:22:15.073 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:15.332 [2024-07-12 15:58:35.539114] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:15.332 [2024-07-12 15:58:35.539144] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:15.332 [2024-07-12 15:58:35.539157] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1db18b0 00:22:15.332 [2024-07-12 15:58:35.539163] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:15.332 [2024-07-12 15:58:35.540341] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:22:15.332 [2024-07-12 15:58:35.540360] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:15.332 BaseBdev4 00:22:15.332 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:15.332 [2024-07-12 15:58:35.727611] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:15.332 [2024-07-12 15:58:35.728611] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:15.332 [2024-07-12 15:58:35.728664] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:15.332 [2024-07-12 15:58:35.728716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:15.333 [2024-07-12 15:58:35.728896] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1db5290 00:22:15.333 [2024-07-12 15:58:35.728904] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:15.333 [2024-07-12 15:58:35.729049] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1db57e0 00:22:15.333 [2024-07-12 15:58:35.729172] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1db5290 00:22:15.333 [2024-07-12 15:58:35.729178] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1db5290 00:22:15.333 [2024-07-12 15:58:35.729253] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:15.333 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:15.591 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:15.591 "name": "raid_bdev1", 00:22:15.591 "uuid": "f1285e3a-5550-4117-b1c4-b94ddbe39873", 00:22:15.591 "strip_size_kb": 0, 00:22:15.592 "state": "online", 00:22:15.592 "raid_level": "raid1", 00:22:15.592 "superblock": true, 00:22:15.592 "num_base_bdevs": 4, 00:22:15.592 "num_base_bdevs_discovered": 4, 00:22:15.592 
"num_base_bdevs_operational": 4, 00:22:15.592 "base_bdevs_list": [ 00:22:15.592 { 00:22:15.592 "name": "BaseBdev1", 00:22:15.592 "uuid": "f474c2db-6406-5bf4-82b6-9d8f4613e223", 00:22:15.592 "is_configured": true, 00:22:15.592 "data_offset": 2048, 00:22:15.592 "data_size": 63488 00:22:15.592 }, 00:22:15.592 { 00:22:15.592 "name": "BaseBdev2", 00:22:15.592 "uuid": "51eb004b-1b68-5d21-b502-bc4c168892c7", 00:22:15.592 "is_configured": true, 00:22:15.592 "data_offset": 2048, 00:22:15.592 "data_size": 63488 00:22:15.592 }, 00:22:15.592 { 00:22:15.592 "name": "BaseBdev3", 00:22:15.592 "uuid": "f0977f62-688c-5100-8cd4-e7a28f2f3402", 00:22:15.592 "is_configured": true, 00:22:15.592 "data_offset": 2048, 00:22:15.592 "data_size": 63488 00:22:15.592 }, 00:22:15.592 { 00:22:15.592 "name": "BaseBdev4", 00:22:15.592 "uuid": "1c1351a0-edab-51f1-966f-d2863b0a9948", 00:22:15.592 "is_configured": true, 00:22:15.592 "data_offset": 2048, 00:22:15.592 "data_size": 63488 00:22:15.592 } 00:22:15.592 ] 00:22:15.592 }' 00:22:15.592 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:15.592 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:16.161 15:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:16.161 15:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:16.161 [2024-07-12 15:58:36.521835] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c08990 00:22:17.100 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.360 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.652 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.652 "name": "raid_bdev1", 00:22:17.652 "uuid": "f1285e3a-5550-4117-b1c4-b94ddbe39873", 00:22:17.652 "strip_size_kb": 0, 00:22:17.652 "state": "online", 00:22:17.652 "raid_level": "raid1", 00:22:17.652 "superblock": true, 00:22:17.652 "num_base_bdevs": 4, 00:22:17.652 "num_base_bdevs_discovered": 4, 00:22:17.652 "num_base_bdevs_operational": 4, 00:22:17.652 "base_bdevs_list": [ 00:22:17.652 { 00:22:17.652 "name": "BaseBdev1", 00:22:17.652 "uuid": "f474c2db-6406-5bf4-82b6-9d8f4613e223", 00:22:17.652 "is_configured": true, 00:22:17.652 "data_offset": 2048, 00:22:17.652 "data_size": 63488 00:22:17.652 }, 00:22:17.652 { 00:22:17.652 "name": "BaseBdev2", 00:22:17.652 "uuid": "51eb004b-1b68-5d21-b502-bc4c168892c7", 00:22:17.652 "is_configured": true, 00:22:17.652 "data_offset": 2048, 00:22:17.652 "data_size": 63488 00:22:17.652 }, 00:22:17.652 { 00:22:17.652 "name": "BaseBdev3", 00:22:17.652 "uuid": "f0977f62-688c-5100-8cd4-e7a28f2f3402", 00:22:17.652 "is_configured": true, 00:22:17.652 "data_offset": 2048, 00:22:17.652 "data_size": 63488 00:22:17.652 }, 00:22:17.652 { 00:22:17.652 "name": "BaseBdev4", 00:22:17.652 "uuid": "1c1351a0-edab-51f1-966f-d2863b0a9948", 00:22:17.652 "is_configured": true, 00:22:17.652 "data_offset": 2048, 00:22:17.652 "data_size": 63488 00:22:17.652 } 00:22:17.652 ] 00:22:17.652 }' 00:22:17.652 15:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.652 15:58:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:17.922 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:18.183 [2024-07-12 15:58:38.537371] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:18.183 [2024-07-12 15:58:38.537394] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:18.183 [2024-07-12 15:58:38.540137] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:18.183 [2024-07-12 15:58:38.540164] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:18.183 [2024-07-12 15:58:38.540262] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:18.183 [2024-07-12 15:58:38.540268] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1db5290 name raid_bdev1, state offline 00:22:18.183 0 00:22:18.183 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2616200 00:22:18.183 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 2616200 ']' 00:22:18.183 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 2616200 00:22:18.183 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:18.184 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:18.184 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2616200 00:22:18.184 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:22:18.184 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:18.184 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2616200' 00:22:18.184 killing process with pid 2616200 00:22:18.184 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 2616200 00:22:18.184 [2024-07-12 15:58:38.606144] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:18.184 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 2616200 00:22:18.184 [2024-07-12 15:58:38.623380] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.FFodchzXkx 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:18.444 00:22:18.444 real 0m6.295s 00:22:18.444 user 0m10.174s 00:22:18.444 sys 0m0.840s 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:18.444 15:58:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:18.444 ************************************ 00:22:18.444 END TEST raid_read_error_test 00:22:18.444 ************************************ 00:22:18.444 15:58:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:18.444 15:58:38 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:22:18.444 15:58:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:18.444 15:58:38 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:18.444 15:58:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:18.444 ************************************ 00:22:18.444 START TEST raid_write_error_test 00:22:18.444 ************************************ 00:22:18.444 15:58:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write 00:22:18.444 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:22:18.444 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:18.444 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 
00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.sPwGAItzdt 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2617498 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2617498 /var/tmp/spdk-raid.sock 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 2617498 ']' 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:18.445 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
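The write variant now repeats the same setup with error_io_type=write. In the read test above, the failure was injected into the error bdev under BaseBdev1 while the workload was running, and because raid1 is redundant the test expects bdevperf to report no failed I/O on raid_bdev1. A condensed sketch of that inject-and-check step as traced in the read case (the write case presumably injects a write failure instead):

    # make every read on the error bdev beneath BaseBdev1 fail
    ./scripts/rpc.py -s /var/tmp/spdk-raid.sock \
        bdev_error_inject_error EE_BaseBdev1_malloc read failure
    # once the run completes, pull the failure rate for raid_bdev1 out of the bdevperf log;
    # raid1 redundancy is expected to absorb the injected errors, so the rate should be 0.00
    fail_per_s=$(grep -v Job "$bdevperf_log" | grep raid_bdev1 | awk '{print $6}')
    [[ $fail_per_s = 0.00 ]]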
00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:18.445 15:58:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:18.705 [2024-07-12 15:58:38.913022] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:22:18.705 [2024-07-12 15:58:38.913082] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2617498 ] 00:22:18.705 [2024-07-12 15:58:39.002222] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:18.705 [2024-07-12 15:58:39.066706] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:18.705 [2024-07-12 15:58:39.107867] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:18.705 [2024-07-12 15:58:39.107891] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:19.644 15:58:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:19.644 15:58:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:19.644 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:19.644 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:19.644 BaseBdev1_malloc 00:22:19.644 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:19.905 true 00:22:19.905 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:19.905 [2024-07-12 15:58:40.294206] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:19.905 [2024-07-12 15:58:40.294239] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.905 [2024-07-12 15:58:40.294251] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x226daa0 00:22:19.905 [2024-07-12 15:58:40.294257] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.905 [2024-07-12 15:58:40.295517] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.905 [2024-07-12 15:58:40.295537] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:19.905 BaseBdev1 00:22:19.905 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:19.905 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:20.166 BaseBdev2_malloc 00:22:20.167 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:20.427 true 00:22:20.427 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:20.427 [2024-07-12 15:58:40.849414] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:20.427 [2024-07-12 15:58:40.849441] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:20.427 [2024-07-12 15:58:40.849452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2272e40 00:22:20.427 [2024-07-12 15:58:40.849458] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:20.427 [2024-07-12 15:58:40.850617] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:20.427 [2024-07-12 15:58:40.850635] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:20.427 BaseBdev2 00:22:20.427 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:20.427 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:20.687 BaseBdev3_malloc 00:22:20.687 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:20.946 true 00:22:20.946 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:21.207 [2024-07-12 15:58:41.400509] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:21.207 [2024-07-12 15:58:41.400533] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:21.207 [2024-07-12 15:58:41.400543] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22747f0 00:22:21.207 [2024-07-12 15:58:41.400549] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:21.207 [2024-07-12 15:58:41.401689] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:21.207 [2024-07-12 15:58:41.401707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:21.207 BaseBdev3 00:22:21.207 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:21.207 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:21.207 BaseBdev4_malloc 00:22:21.207 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:21.469 true 00:22:21.469 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:21.729 [2024-07-12 15:58:41.951419] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:21.729 [2024-07-12 15:58:41.951442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:22:21.729 [2024-07-12 15:58:41.951452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22728b0 00:22:21.729 [2024-07-12 15:58:41.951458] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:21.729 [2024-07-12 15:58:41.952603] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:21.729 [2024-07-12 15:58:41.952620] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:21.729 BaseBdev4 00:22:21.729 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:21.729 [2024-07-12 15:58:42.135908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:21.729 [2024-07-12 15:58:42.136921] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:21.729 [2024-07-12 15:58:42.136974] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:21.729 [2024-07-12 15:58:42.137019] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:21.729 [2024-07-12 15:58:42.137196] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2276290 00:22:21.729 [2024-07-12 15:58:42.137204] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:21.729 [2024-07-12 15:58:42.137343] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22767e0 00:22:21.729 [2024-07-12 15:58:42.137462] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2276290 00:22:21.729 [2024-07-12 15:58:42.137468] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2276290 00:22:21.729 [2024-07-12 15:58:42.137543] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.729 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.989 15:58:42 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.989 "name": "raid_bdev1", 00:22:21.989 "uuid": "8173caea-953f-495a-b627-1712801b055d", 00:22:21.989 "strip_size_kb": 0, 00:22:21.989 "state": "online", 00:22:21.989 "raid_level": "raid1", 00:22:21.989 "superblock": true, 00:22:21.989 "num_base_bdevs": 4, 00:22:21.989 "num_base_bdevs_discovered": 4, 00:22:21.989 "num_base_bdevs_operational": 4, 00:22:21.989 "base_bdevs_list": [ 00:22:21.989 { 00:22:21.989 "name": "BaseBdev1", 00:22:21.989 "uuid": "2db6e46c-72f2-5411-a9ba-7c15cf551e31", 00:22:21.989 "is_configured": true, 00:22:21.989 "data_offset": 2048, 00:22:21.989 "data_size": 63488 00:22:21.989 }, 00:22:21.989 { 00:22:21.989 "name": "BaseBdev2", 00:22:21.989 "uuid": "ccaa566c-4797-53a7-b94f-b31b33e572a0", 00:22:21.989 "is_configured": true, 00:22:21.989 "data_offset": 2048, 00:22:21.989 "data_size": 63488 00:22:21.989 }, 00:22:21.989 { 00:22:21.989 "name": "BaseBdev3", 00:22:21.989 "uuid": "f14c173a-06b2-576c-86e7-647d8783fdcd", 00:22:21.989 "is_configured": true, 00:22:21.989 "data_offset": 2048, 00:22:21.989 "data_size": 63488 00:22:21.989 }, 00:22:21.989 { 00:22:21.989 "name": "BaseBdev4", 00:22:21.989 "uuid": "3e2c3407-e978-585e-9c40-24a1322b4758", 00:22:21.989 "is_configured": true, 00:22:21.989 "data_offset": 2048, 00:22:21.989 "data_size": 63488 00:22:21.989 } 00:22:21.989 ] 00:22:21.989 }' 00:22:21.989 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.989 15:58:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:22.558 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:22.558 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:22.558 [2024-07-12 15:58:42.994288] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20c9990 00:22:23.496 15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:23.756 [2024-07-12 15:58:44.084382] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:22:23.756 [2024-07-12 15:58:44.084426] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:23.756 [2024-07-12 15:58:44.084623] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x20c9990 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:23.756 
15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.756 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.017 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:24.017 "name": "raid_bdev1", 00:22:24.017 "uuid": "8173caea-953f-495a-b627-1712801b055d", 00:22:24.017 "strip_size_kb": 0, 00:22:24.017 "state": "online", 00:22:24.017 "raid_level": "raid1", 00:22:24.017 "superblock": true, 00:22:24.017 "num_base_bdevs": 4, 00:22:24.017 "num_base_bdevs_discovered": 3, 00:22:24.017 "num_base_bdevs_operational": 3, 00:22:24.017 "base_bdevs_list": [ 00:22:24.017 { 00:22:24.017 "name": null, 00:22:24.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:24.017 "is_configured": false, 00:22:24.017 "data_offset": 2048, 00:22:24.017 "data_size": 63488 00:22:24.017 }, 00:22:24.017 { 00:22:24.017 "name": "BaseBdev2", 00:22:24.017 "uuid": "ccaa566c-4797-53a7-b94f-b31b33e572a0", 00:22:24.017 "is_configured": true, 00:22:24.017 "data_offset": 2048, 00:22:24.017 "data_size": 63488 00:22:24.017 }, 00:22:24.017 { 00:22:24.017 "name": "BaseBdev3", 00:22:24.017 "uuid": "f14c173a-06b2-576c-86e7-647d8783fdcd", 00:22:24.017 "is_configured": true, 00:22:24.017 "data_offset": 2048, 00:22:24.017 "data_size": 63488 00:22:24.017 }, 00:22:24.017 { 00:22:24.017 "name": "BaseBdev4", 00:22:24.017 "uuid": "3e2c3407-e978-585e-9c40-24a1322b4758", 00:22:24.017 "is_configured": true, 00:22:24.017 "data_offset": 2048, 00:22:24.017 "data_size": 63488 00:22:24.017 } 00:22:24.017 ] 00:22:24.017 }' 00:22:24.017 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:24.017 15:58:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:24.587 15:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:24.587 [2024-07-12 15:58:45.029989] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:24.587 [2024-07-12 15:58:45.030022] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:24.587 [2024-07-12 15:58:45.032642] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:24.587 [2024-07-12 15:58:45.032669] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:24.587 [2024-07-12 15:58:45.032753] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:24.587 [2024-07-12 15:58:45.032760] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2276290 name raid_bdev1, state 
offline 00:22:24.587 0 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2617498 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 2617498 ']' 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 2617498 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2617498 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2617498' 00:22:24.847 killing process with pid 2617498 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 2617498 00:22:24.847 [2024-07-12 15:58:45.117143] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 2617498 00:22:24.847 [2024-07-12 15:58:45.134219] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.sPwGAItzdt 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:24.847 00:22:24.847 real 0m6.432s 00:22:24.847 user 0m10.389s 00:22:24.847 sys 0m0.868s 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:24.847 15:58:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:24.847 ************************************ 00:22:24.848 END TEST raid_write_error_test 00:22:24.848 ************************************ 00:22:25.107 15:58:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:25.107 15:58:45 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:22:25.107 15:58:45 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:22:25.107 15:58:45 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:22:25.107 15:58:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:25.107 15:58:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:25.107 15:58:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:25.107 ************************************ 00:22:25.107 START TEST raid_rebuild_test 00:22:25.107 ************************************ 00:22:25.107 
15:58:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:25.107 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2618536 00:22:25.108 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2618536 /var/tmp/spdk-raid.sock 00:22:25.108 15:58:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2618536 ']' 00:22:25.108 15:58:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:25.108 15:58:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:25.108 15:58:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:25.108 15:58:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:25.108 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
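raid_rebuild_test raid1 2 false false true brings up a second bdevperf instance; the entries below create two passthru-wrapped malloc base bdevs plus a delayed "spare" bdev and join the first two into a RAID1 without a superblock. A condensed sketch of those RPCs as they appear later in this log (same socket and helper, paths shortened; BaseBdev2 is created the same way as BaseBdev1):

  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
  # rebuild target: malloc behind a delay bdev, exposed as "spare"
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
  # RAID1 over the two base bdevs, no superblock (no -s)
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1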
00:22:25.108 15:58:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:25.108 15:58:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:25.108 [2024-07-12 15:58:45.407776] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:22:25.108 [2024-07-12 15:58:45.407818] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2618536 ] 00:22:25.108 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:25.108 Zero copy mechanism will not be used. 00:22:25.108 [2024-07-12 15:58:45.493174] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:25.367 [2024-07-12 15:58:45.557351] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:25.367 [2024-07-12 15:58:45.599993] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:25.367 [2024-07-12 15:58:45.600017] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:25.936 15:58:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:25.936 15:58:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:22:25.937 15:58:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:25.937 15:58:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:26.196 BaseBdev1_malloc 00:22:26.196 15:58:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:26.197 [2024-07-12 15:58:46.594153] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:26.197 [2024-07-12 15:58:46.594189] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:26.197 [2024-07-12 15:58:46.594201] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21dc010 00:22:26.197 [2024-07-12 15:58:46.594207] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:26.197 [2024-07-12 15:58:46.595461] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:26.197 [2024-07-12 15:58:46.595481] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:26.197 BaseBdev1 00:22:26.197 15:58:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:26.197 15:58:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:26.456 BaseBdev2_malloc 00:22:26.456 15:58:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:26.716 [2024-07-12 15:58:46.960802] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:26.716 [2024-07-12 15:58:46.960827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:26.716 
[2024-07-12 15:58:46.960839] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21dcc30 00:22:26.716 [2024-07-12 15:58:46.960845] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:26.716 [2024-07-12 15:58:46.961977] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:26.716 [2024-07-12 15:58:46.961996] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:26.716 BaseBdev2 00:22:26.716 15:58:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:26.716 spare_malloc 00:22:26.716 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:26.976 spare_delay 00:22:26.976 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:27.235 [2024-07-12 15:58:47.519813] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:27.235 [2024-07-12 15:58:47.519836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:27.235 [2024-07-12 15:58:47.519846] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238d7f0 00:22:27.235 [2024-07-12 15:58:47.519852] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:27.235 [2024-07-12 15:58:47.520994] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:27.235 [2024-07-12 15:58:47.521012] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:27.235 spare 00:22:27.235 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:27.494 [2024-07-12 15:58:47.700284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:27.494 [2024-07-12 15:58:47.701272] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:27.494 [2024-07-12 15:58:47.701332] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21d4d30 00:22:27.494 [2024-07-12 15:58:47.701339] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:27.494 [2024-07-12 15:58:47.701476] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x237f7d0 00:22:27.494 [2024-07-12 15:58:47.701583] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21d4d30 00:22:27.494 [2024-07-12 15:58:47.701589] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21d4d30 00:22:27.494 [2024-07-12 15:58:47.701667] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.494 "name": "raid_bdev1", 00:22:27.494 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:27.494 "strip_size_kb": 0, 00:22:27.494 "state": "online", 00:22:27.494 "raid_level": "raid1", 00:22:27.494 "superblock": false, 00:22:27.494 "num_base_bdevs": 2, 00:22:27.494 "num_base_bdevs_discovered": 2, 00:22:27.494 "num_base_bdevs_operational": 2, 00:22:27.494 "base_bdevs_list": [ 00:22:27.494 { 00:22:27.494 "name": "BaseBdev1", 00:22:27.494 "uuid": "45d94e86-b930-5d9d-9d93-525bf83c6bd2", 00:22:27.494 "is_configured": true, 00:22:27.494 "data_offset": 0, 00:22:27.494 "data_size": 65536 00:22:27.494 }, 00:22:27.494 { 00:22:27.494 "name": "BaseBdev2", 00:22:27.494 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:27.494 "is_configured": true, 00:22:27.494 "data_offset": 0, 00:22:27.494 "data_size": 65536 00:22:27.494 } 00:22:27.494 ] 00:22:27.494 }' 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.494 15:58:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:28.061 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:28.061 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:28.320 [2024-07-12 15:58:48.622814] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:28.320 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:28.320 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.320 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:28.578 15:58:48 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:29.146 [2024-07-12 15:58:49.344475] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23823b0 00:22:29.146 /dev/nbd0 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:29.146 1+0 records in 00:22:29.146 1+0 records out 00:22:29.146 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311923 s, 13.1 MB/s 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 
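With raid_bdev1 exported over NBD, the remainder of the test fills the array through /dev/nbd0, removes BaseBdev1, attaches the spare and waits for the rebuild to complete. Sketched as standalone commands matching the RPCs logged below (the jq filter is the one the script itself uses; the polling loop is illustrative only):

  dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct
  scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
  scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
  # illustrative wait: poll until no rebuild process is reported for raid_bdev1
  while scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"' \
        | grep -q rebuild; do sleep 1; done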
00:22:29.146 15:58:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:22:33.343 65536+0 records in 00:22:33.343 65536+0 records out 00:22:33.343 33554432 bytes (34 MB, 32 MiB) copied, 4.2732 s, 7.9 MB/s 00:22:33.343 15:58:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:33.343 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:33.343 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:33.343 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:33.343 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:33.343 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:33.343 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:33.602 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:33.602 [2024-07-12 15:58:53.884367] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:33.602 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:33.602 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:33.602 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:33.602 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:33.602 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:33.602 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:33.602 15:58:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:33.602 15:58:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:33.862 [2024-07-12 15:58:54.062872] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.862 "name": "raid_bdev1", 00:22:33.862 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:33.862 "strip_size_kb": 0, 00:22:33.862 "state": "online", 00:22:33.862 "raid_level": "raid1", 00:22:33.862 "superblock": false, 00:22:33.862 "num_base_bdevs": 2, 00:22:33.862 "num_base_bdevs_discovered": 1, 00:22:33.862 "num_base_bdevs_operational": 1, 00:22:33.862 "base_bdevs_list": [ 00:22:33.862 { 00:22:33.862 "name": null, 00:22:33.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:33.862 "is_configured": false, 00:22:33.862 "data_offset": 0, 00:22:33.862 "data_size": 65536 00:22:33.862 }, 00:22:33.862 { 00:22:33.862 "name": "BaseBdev2", 00:22:33.862 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:33.862 "is_configured": true, 00:22:33.862 "data_offset": 0, 00:22:33.862 "data_size": 65536 00:22:33.862 } 00:22:33.862 ] 00:22:33.862 }' 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.862 15:58:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:34.431 15:58:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:34.691 [2024-07-12 15:58:55.017364] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:34.691 [2024-07-12 15:58:55.020736] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d3dc0 00:22:34.691 [2024-07-12 15:58:55.022276] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:34.691 15:58:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:35.630 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:35.630 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:35.630 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:35.630 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:35.630 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:35.630 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.630 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.890 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:35.890 "name": "raid_bdev1", 00:22:35.890 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:35.890 "strip_size_kb": 0, 00:22:35.890 "state": "online", 00:22:35.890 "raid_level": "raid1", 00:22:35.890 "superblock": false, 00:22:35.890 "num_base_bdevs": 2, 00:22:35.890 "num_base_bdevs_discovered": 2, 00:22:35.890 "num_base_bdevs_operational": 2, 00:22:35.890 "process": { 00:22:35.890 "type": "rebuild", 00:22:35.890 "target": "spare", 00:22:35.890 "progress": { 00:22:35.890 "blocks": 22528, 00:22:35.890 "percent": 34 00:22:35.890 } 00:22:35.890 }, 00:22:35.890 
"base_bdevs_list": [ 00:22:35.890 { 00:22:35.890 "name": "spare", 00:22:35.890 "uuid": "586c86e6-f027-54b8-9a22-5936a6ec88f5", 00:22:35.890 "is_configured": true, 00:22:35.890 "data_offset": 0, 00:22:35.890 "data_size": 65536 00:22:35.890 }, 00:22:35.890 { 00:22:35.890 "name": "BaseBdev2", 00:22:35.890 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:35.890 "is_configured": true, 00:22:35.890 "data_offset": 0, 00:22:35.890 "data_size": 65536 00:22:35.890 } 00:22:35.890 ] 00:22:35.890 }' 00:22:35.890 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:35.890 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:35.890 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:35.890 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:35.890 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:36.459 [2024-07-12 15:58:56.844355] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:36.719 [2024-07-12 15:58:56.933446] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:36.719 [2024-07-12 15:58:56.933481] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:36.719 [2024-07-12 15:58:56.933490] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:36.719 [2024-07-12 15:58:56.933495] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.719 15:58:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.719 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:36.719 "name": "raid_bdev1", 00:22:36.719 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:36.719 "strip_size_kb": 0, 00:22:36.719 "state": "online", 00:22:36.719 "raid_level": "raid1", 00:22:36.719 "superblock": false, 00:22:36.719 "num_base_bdevs": 2, 00:22:36.719 
"num_base_bdevs_discovered": 1, 00:22:36.719 "num_base_bdevs_operational": 1, 00:22:36.719 "base_bdevs_list": [ 00:22:36.719 { 00:22:36.719 "name": null, 00:22:36.719 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.719 "is_configured": false, 00:22:36.719 "data_offset": 0, 00:22:36.719 "data_size": 65536 00:22:36.719 }, 00:22:36.719 { 00:22:36.719 "name": "BaseBdev2", 00:22:36.719 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:36.719 "is_configured": true, 00:22:36.719 "data_offset": 0, 00:22:36.719 "data_size": 65536 00:22:36.719 } 00:22:36.719 ] 00:22:36.719 }' 00:22:36.719 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:36.719 15:58:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:37.289 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:37.289 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:37.289 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:37.289 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:37.289 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:37.289 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.289 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:37.548 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:37.548 "name": "raid_bdev1", 00:22:37.548 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:37.548 "strip_size_kb": 0, 00:22:37.548 "state": "online", 00:22:37.548 "raid_level": "raid1", 00:22:37.548 "superblock": false, 00:22:37.548 "num_base_bdevs": 2, 00:22:37.548 "num_base_bdevs_discovered": 1, 00:22:37.548 "num_base_bdevs_operational": 1, 00:22:37.548 "base_bdevs_list": [ 00:22:37.548 { 00:22:37.548 "name": null, 00:22:37.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.548 "is_configured": false, 00:22:37.548 "data_offset": 0, 00:22:37.548 "data_size": 65536 00:22:37.548 }, 00:22:37.548 { 00:22:37.548 "name": "BaseBdev2", 00:22:37.548 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:37.548 "is_configured": true, 00:22:37.548 "data_offset": 0, 00:22:37.548 "data_size": 65536 00:22:37.548 } 00:22:37.548 ] 00:22:37.548 }' 00:22:37.548 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:37.548 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:37.548 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:37.808 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:37.808 15:58:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:37.808 [2024-07-12 15:58:58.175779] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:37.808 [2024-07-12 15:58:58.179025] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21d3dc0 00:22:37.808 [2024-07-12 15:58:58.180158] 
bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:37.808 15:58:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:39.191 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:39.191 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:39.191 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:39.191 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:39.191 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:39.191 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.191 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.191 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:39.191 "name": "raid_bdev1", 00:22:39.191 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:39.191 "strip_size_kb": 0, 00:22:39.191 "state": "online", 00:22:39.191 "raid_level": "raid1", 00:22:39.191 "superblock": false, 00:22:39.191 "num_base_bdevs": 2, 00:22:39.191 "num_base_bdevs_discovered": 2, 00:22:39.191 "num_base_bdevs_operational": 2, 00:22:39.191 "process": { 00:22:39.191 "type": "rebuild", 00:22:39.191 "target": "spare", 00:22:39.191 "progress": { 00:22:39.191 "blocks": 24576, 00:22:39.191 "percent": 37 00:22:39.191 } 00:22:39.191 }, 00:22:39.191 "base_bdevs_list": [ 00:22:39.191 { 00:22:39.191 "name": "spare", 00:22:39.191 "uuid": "586c86e6-f027-54b8-9a22-5936a6ec88f5", 00:22:39.191 "is_configured": true, 00:22:39.191 "data_offset": 0, 00:22:39.191 "data_size": 65536 00:22:39.192 }, 00:22:39.192 { 00:22:39.192 "name": "BaseBdev2", 00:22:39.192 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:39.192 "is_configured": true, 00:22:39.192 "data_offset": 0, 00:22:39.192 "data_size": 65536 00:22:39.192 } 00:22:39.192 ] 00:22:39.192 }' 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=685 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:39.192 15:58:59 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.192 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.538 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:39.538 "name": "raid_bdev1", 00:22:39.538 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:39.538 "strip_size_kb": 0, 00:22:39.538 "state": "online", 00:22:39.538 "raid_level": "raid1", 00:22:39.538 "superblock": false, 00:22:39.538 "num_base_bdevs": 2, 00:22:39.538 "num_base_bdevs_discovered": 2, 00:22:39.538 "num_base_bdevs_operational": 2, 00:22:39.538 "process": { 00:22:39.538 "type": "rebuild", 00:22:39.538 "target": "spare", 00:22:39.538 "progress": { 00:22:39.538 "blocks": 28672, 00:22:39.538 "percent": 43 00:22:39.538 } 00:22:39.538 }, 00:22:39.538 "base_bdevs_list": [ 00:22:39.538 { 00:22:39.538 "name": "spare", 00:22:39.538 "uuid": "586c86e6-f027-54b8-9a22-5936a6ec88f5", 00:22:39.538 "is_configured": true, 00:22:39.538 "data_offset": 0, 00:22:39.538 "data_size": 65536 00:22:39.538 }, 00:22:39.538 { 00:22:39.538 "name": "BaseBdev2", 00:22:39.538 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:39.538 "is_configured": true, 00:22:39.538 "data_offset": 0, 00:22:39.538 "data_size": 65536 00:22:39.538 } 00:22:39.538 ] 00:22:39.538 }' 00:22:39.538 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:39.538 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:39.538 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:39.538 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:39.538 15:58:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:40.490 15:59:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:40.490 15:59:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:40.490 15:59:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:40.490 15:59:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:40.490 15:59:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:40.490 15:59:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:40.490 15:59:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:40.490 15:59:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:40.751 15:59:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:40.751 "name": "raid_bdev1", 00:22:40.751 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:40.751 "strip_size_kb": 0, 00:22:40.751 "state": "online", 00:22:40.751 
"raid_level": "raid1", 00:22:40.751 "superblock": false, 00:22:40.751 "num_base_bdevs": 2, 00:22:40.751 "num_base_bdevs_discovered": 2, 00:22:40.751 "num_base_bdevs_operational": 2, 00:22:40.751 "process": { 00:22:40.751 "type": "rebuild", 00:22:40.751 "target": "spare", 00:22:40.751 "progress": { 00:22:40.751 "blocks": 55296, 00:22:40.751 "percent": 84 00:22:40.751 } 00:22:40.751 }, 00:22:40.751 "base_bdevs_list": [ 00:22:40.751 { 00:22:40.751 "name": "spare", 00:22:40.751 "uuid": "586c86e6-f027-54b8-9a22-5936a6ec88f5", 00:22:40.751 "is_configured": true, 00:22:40.751 "data_offset": 0, 00:22:40.751 "data_size": 65536 00:22:40.751 }, 00:22:40.751 { 00:22:40.751 "name": "BaseBdev2", 00:22:40.751 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:40.751 "is_configured": true, 00:22:40.751 "data_offset": 0, 00:22:40.751 "data_size": 65536 00:22:40.751 } 00:22:40.751 ] 00:22:40.751 }' 00:22:40.751 15:59:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:40.751 15:59:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:40.751 15:59:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:40.751 15:59:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:40.751 15:59:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:41.009 [2024-07-12 15:59:01.398747] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:41.009 [2024-07-12 15:59:01.398797] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:41.009 [2024-07-12 15:59:01.398828] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:41.944 "name": "raid_bdev1", 00:22:41.944 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:41.944 "strip_size_kb": 0, 00:22:41.944 "state": "online", 00:22:41.944 "raid_level": "raid1", 00:22:41.944 "superblock": false, 00:22:41.944 "num_base_bdevs": 2, 00:22:41.944 "num_base_bdevs_discovered": 2, 00:22:41.944 "num_base_bdevs_operational": 2, 00:22:41.944 "base_bdevs_list": [ 00:22:41.944 { 00:22:41.944 "name": "spare", 00:22:41.944 "uuid": "586c86e6-f027-54b8-9a22-5936a6ec88f5", 00:22:41.944 "is_configured": true, 00:22:41.944 "data_offset": 0, 00:22:41.944 "data_size": 65536 00:22:41.944 }, 00:22:41.944 { 00:22:41.944 "name": "BaseBdev2", 00:22:41.944 
"uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:41.944 "is_configured": true, 00:22:41.944 "data_offset": 0, 00:22:41.944 "data_size": 65536 00:22:41.944 } 00:22:41.944 ] 00:22:41.944 }' 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:41.944 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:42.204 "name": "raid_bdev1", 00:22:42.204 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:42.204 "strip_size_kb": 0, 00:22:42.204 "state": "online", 00:22:42.204 "raid_level": "raid1", 00:22:42.204 "superblock": false, 00:22:42.204 "num_base_bdevs": 2, 00:22:42.204 "num_base_bdevs_discovered": 2, 00:22:42.204 "num_base_bdevs_operational": 2, 00:22:42.204 "base_bdevs_list": [ 00:22:42.204 { 00:22:42.204 "name": "spare", 00:22:42.204 "uuid": "586c86e6-f027-54b8-9a22-5936a6ec88f5", 00:22:42.204 "is_configured": true, 00:22:42.204 "data_offset": 0, 00:22:42.204 "data_size": 65536 00:22:42.204 }, 00:22:42.204 { 00:22:42.204 "name": "BaseBdev2", 00:22:42.204 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:42.204 "is_configured": true, 00:22:42.204 "data_offset": 0, 00:22:42.204 "data_size": 65536 00:22:42.204 } 00:22:42.204 ] 00:22:42.204 }' 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:42.204 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.463 15:59:02 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.463 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.463 "name": "raid_bdev1", 00:22:42.463 "uuid": "e078e4e4-e089-464d-9b6b-52e721e5050c", 00:22:42.463 "strip_size_kb": 0, 00:22:42.463 "state": "online", 00:22:42.463 "raid_level": "raid1", 00:22:42.463 "superblock": false, 00:22:42.463 "num_base_bdevs": 2, 00:22:42.463 "num_base_bdevs_discovered": 2, 00:22:42.463 "num_base_bdevs_operational": 2, 00:22:42.463 "base_bdevs_list": [ 00:22:42.463 { 00:22:42.463 "name": "spare", 00:22:42.463 "uuid": "586c86e6-f027-54b8-9a22-5936a6ec88f5", 00:22:42.463 "is_configured": true, 00:22:42.463 "data_offset": 0, 00:22:42.463 "data_size": 65536 00:22:42.463 }, 00:22:42.463 { 00:22:42.463 "name": "BaseBdev2", 00:22:42.464 "uuid": "8d3a0160-3485-5cd6-a989-b366b56acf0f", 00:22:42.464 "is_configured": true, 00:22:42.464 "data_offset": 0, 00:22:42.464 "data_size": 65536 00:22:42.464 } 00:22:42.464 ] 00:22:42.464 }' 00:22:42.464 15:59:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.464 15:59:02 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:43.028 15:59:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:43.287 [2024-07-12 15:59:03.588568] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:43.287 [2024-07-12 15:59:03.588590] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:43.287 [2024-07-12 15:59:03.588637] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:43.287 [2024-07-12 15:59:03.588679] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:43.287 [2024-07-12 15:59:03.588685] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21d4d30 name raid_bdev1, state offline 00:22:43.287 15:59:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:43.287 15:59:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' 
'/dev/nbd0 /dev/nbd1' 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:43.548 15:59:03 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:44.118 /dev/nbd0 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:44.118 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:44.118 1+0 records in 00:22:44.118 1+0 records out 00:22:44.118 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278659 s, 14.7 MB/s 00:22:44.119 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:44.119 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:44.119 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:44.119 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:44.119 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:44.119 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:44.119 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:44.119 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:44.119 /dev/nbd1 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:44.380 1+0 records in 00:22:44.380 1+0 records out 00:22:44.380 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272447 s, 15.0 MB/s 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:44.380 15:59:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:44.950 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:44.950 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:44.950 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:44.950 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:44.950 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:44.950 15:59:05 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:44.950 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:44.950 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:44.950 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:44.950 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2618536 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2618536 ']' 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2618536 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2618536 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2618536' 00:22:45.519 killing process with pid 2618536 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2618536 00:22:45.519 Received shutdown signal, test time was about 60.000000 seconds 00:22:45.519 00:22:45.519 Latency(us) 00:22:45.519 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:45.519 =================================================================================================================== 00:22:45.519 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:45.519 [2024-07-12 15:59:05.810550] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2618536 00:22:45.519 [2024-07-12 15:59:05.825253] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:22:45.519 00:22:45.519 real 0m20.601s 00:22:45.519 user 0m29.346s 00:22:45.519 sys 0m3.367s 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:22:45.519 15:59:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:45.519 ************************************ 00:22:45.519 END TEST raid_rebuild_test 00:22:45.519 ************************************ 00:22:45.780 15:59:05 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:45.780 15:59:05 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:22:45.780 15:59:05 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:22:45.780 15:59:05 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:45.780 15:59:05 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:45.780 ************************************ 00:22:45.780 START TEST raid_rebuild_test_sb 00:22:45.780 ************************************ 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:45.780 15:59:06 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2622158 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2622158 /var/tmp/spdk-raid.sock 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2622158 ']' 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:45.780 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:45.780 15:59:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:45.780 [2024-07-12 15:59:06.089066] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:22:45.780 [2024-07-12 15:59:06.089118] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2622158 ] 00:22:45.780 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:45.780 Zero copy mechanism will not be used. 
00:22:45.780 [2024-07-12 15:59:06.177543] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:46.039 [2024-07-12 15:59:06.245942] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:46.039 [2024-07-12 15:59:06.286596] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.039 [2024-07-12 15:59:06.286618] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:46.979 15:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:46.979 15:59:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:22:46.979 15:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:46.979 15:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:47.548 BaseBdev1_malloc 00:22:47.548 15:59:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:48.119 [2024-07-12 15:59:08.331269] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:48.119 [2024-07-12 15:59:08.331305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:48.119 [2024-07-12 15:59:08.331320] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xea9010 00:22:48.119 [2024-07-12 15:59:08.331326] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:48.119 [2024-07-12 15:59:08.332723] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:48.119 [2024-07-12 15:59:08.332744] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:48.119 BaseBdev1 00:22:48.119 15:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:48.119 15:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:48.688 BaseBdev2_malloc 00:22:48.688 15:59:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:49.259 [2024-07-12 15:59:09.415920] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:49.259 [2024-07-12 15:59:09.415951] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:49.259 [2024-07-12 15:59:09.415965] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xea9c30 00:22:49.259 [2024-07-12 15:59:09.415971] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:49.259 [2024-07-12 15:59:09.417150] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:49.259 [2024-07-12 15:59:09.417169] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:49.259 BaseBdev2 00:22:49.259 15:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:22:49.520 spare_malloc 00:22:49.780 15:59:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:50.349 spare_delay 00:22:50.349 15:59:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:50.609 [2024-07-12 15:59:11.041884] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:50.609 [2024-07-12 15:59:11.041915] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:50.609 [2024-07-12 15:59:11.041929] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x105a7f0 00:22:50.609 [2024-07-12 15:59:11.041935] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:50.609 [2024-07-12 15:59:11.043198] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:50.609 [2024-07-12 15:59:11.043216] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:50.609 spare 00:22:50.868 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:51.438 [2024-07-12 15:59:11.583244] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:51.438 [2024-07-12 15:59:11.584252] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:51.438 [2024-07-12 15:59:11.584371] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xea1d30 00:22:51.438 [2024-07-12 15:59:11.584383] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:51.438 [2024-07-12 15:59:11.584536] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xea19c0 00:22:51.438 [2024-07-12 15:59:11.584646] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xea1d30 00:22:51.438 [2024-07-12 15:59:11.584652] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xea1d30 00:22:51.438 [2024-07-12 15:59:11.584729] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:51.438 15:59:11 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.438 15:59:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.008 15:59:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.008 "name": "raid_bdev1", 00:22:52.008 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:22:52.008 "strip_size_kb": 0, 00:22:52.008 "state": "online", 00:22:52.008 "raid_level": "raid1", 00:22:52.008 "superblock": true, 00:22:52.008 "num_base_bdevs": 2, 00:22:52.008 "num_base_bdevs_discovered": 2, 00:22:52.008 "num_base_bdevs_operational": 2, 00:22:52.008 "base_bdevs_list": [ 00:22:52.008 { 00:22:52.008 "name": "BaseBdev1", 00:22:52.008 "uuid": "25a0abc1-ed1c-565f-a681-e25c22a92f0d", 00:22:52.008 "is_configured": true, 00:22:52.008 "data_offset": 2048, 00:22:52.008 "data_size": 63488 00:22:52.008 }, 00:22:52.008 { 00:22:52.008 "name": "BaseBdev2", 00:22:52.008 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:22:52.008 "is_configured": true, 00:22:52.008 "data_offset": 2048, 00:22:52.008 "data_size": 63488 00:22:52.008 } 00:22:52.008 ] 00:22:52.008 }' 00:22:52.008 15:59:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.008 15:59:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:52.267 15:59:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:52.267 15:59:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:52.527 [2024-07-12 15:59:12.878693] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:52.527 15:59:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:52.527 15:59:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.527 15:59:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:52.786 15:59:13 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:52.786 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:53.047 [2024-07-12 15:59:13.267521] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xea19c0 00:22:53.047 /dev/nbd0 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:53.047 1+0 records in 00:22:53.047 1+0 records out 00:22:53.047 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272669 s, 15.0 MB/s 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:22:53.047 15:59:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:22:58.328 63488+0 records in 00:22:58.328 63488+0 records out 00:22:58.328 32505856 bytes (33 MB, 31 MiB) copied, 4.58927 s, 7.1 MB/s 00:22:58.328 15:59:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:58.328 15:59:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:22:58.328 15:59:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:58.328 15:59:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:58.328 15:59:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:58.328 15:59:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:58.328 15:59:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:58.329 [2024-07-12 15:59:18.104498] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:58.329 [2024-07-12 15:59:18.283819] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.329 "name": "raid_bdev1", 00:22:58.329 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:22:58.329 "strip_size_kb": 0, 00:22:58.329 "state": "online", 
00:22:58.329 "raid_level": "raid1", 00:22:58.329 "superblock": true, 00:22:58.329 "num_base_bdevs": 2, 00:22:58.329 "num_base_bdevs_discovered": 1, 00:22:58.329 "num_base_bdevs_operational": 1, 00:22:58.329 "base_bdevs_list": [ 00:22:58.329 { 00:22:58.329 "name": null, 00:22:58.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:58.329 "is_configured": false, 00:22:58.329 "data_offset": 2048, 00:22:58.329 "data_size": 63488 00:22:58.329 }, 00:22:58.329 { 00:22:58.329 "name": "BaseBdev2", 00:22:58.329 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:22:58.329 "is_configured": true, 00:22:58.329 "data_offset": 2048, 00:22:58.329 "data_size": 63488 00:22:58.329 } 00:22:58.329 ] 00:22:58.329 }' 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.329 15:59:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:58.589 15:59:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:58.848 [2024-07-12 15:59:19.198137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:58.848 [2024-07-12 15:59:19.201581] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xea07e0 00:22:58.848 [2024-07-12 15:59:19.203119] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:58.848 15:59:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:59.788 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:59.788 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:59.788 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:59.788 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:59.788 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:59.788 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.788 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:00.049 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:00.049 "name": "raid_bdev1", 00:23:00.049 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:00.049 "strip_size_kb": 0, 00:23:00.049 "state": "online", 00:23:00.049 "raid_level": "raid1", 00:23:00.049 "superblock": true, 00:23:00.049 "num_base_bdevs": 2, 00:23:00.049 "num_base_bdevs_discovered": 2, 00:23:00.049 "num_base_bdevs_operational": 2, 00:23:00.049 "process": { 00:23:00.049 "type": "rebuild", 00:23:00.049 "target": "spare", 00:23:00.049 "progress": { 00:23:00.049 "blocks": 22528, 00:23:00.049 "percent": 35 00:23:00.049 } 00:23:00.049 }, 00:23:00.049 "base_bdevs_list": [ 00:23:00.049 { 00:23:00.049 "name": "spare", 00:23:00.049 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:00.049 "is_configured": true, 00:23:00.049 "data_offset": 2048, 00:23:00.049 "data_size": 63488 00:23:00.049 }, 00:23:00.049 { 00:23:00.049 "name": "BaseBdev2", 00:23:00.049 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:00.049 "is_configured": true, 00:23:00.049 
"data_offset": 2048, 00:23:00.049 "data_size": 63488 00:23:00.049 } 00:23:00.049 ] 00:23:00.049 }' 00:23:00.049 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:00.049 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:00.049 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:00.309 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:00.309 15:59:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:00.880 [2024-07-12 15:59:21.021731] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:00.880 [2024-07-12 15:59:21.114265] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:00.880 [2024-07-12 15:59:21.114301] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:00.880 [2024-07-12 15:59:21.114311] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:00.880 [2024-07-12 15:59:21.114316] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.880 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.139 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:01.139 "name": "raid_bdev1", 00:23:01.139 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:01.139 "strip_size_kb": 0, 00:23:01.139 "state": "online", 00:23:01.139 "raid_level": "raid1", 00:23:01.139 "superblock": true, 00:23:01.139 "num_base_bdevs": 2, 00:23:01.139 "num_base_bdevs_discovered": 1, 00:23:01.139 "num_base_bdevs_operational": 1, 00:23:01.139 "base_bdevs_list": [ 00:23:01.139 { 00:23:01.139 "name": null, 00:23:01.139 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.139 "is_configured": false, 00:23:01.139 "data_offset": 2048, 00:23:01.139 "data_size": 63488 00:23:01.139 }, 00:23:01.139 { 
00:23:01.139 "name": "BaseBdev2", 00:23:01.139 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:01.139 "is_configured": true, 00:23:01.139 "data_offset": 2048, 00:23:01.139 "data_size": 63488 00:23:01.139 } 00:23:01.139 ] 00:23:01.139 }' 00:23:01.139 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:01.139 15:59:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:01.709 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:01.709 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:01.709 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:01.709 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:01.709 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:01.709 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.709 15:59:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.709 15:59:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:01.709 "name": "raid_bdev1", 00:23:01.709 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:01.709 "strip_size_kb": 0, 00:23:01.709 "state": "online", 00:23:01.709 "raid_level": "raid1", 00:23:01.709 "superblock": true, 00:23:01.709 "num_base_bdevs": 2, 00:23:01.709 "num_base_bdevs_discovered": 1, 00:23:01.709 "num_base_bdevs_operational": 1, 00:23:01.709 "base_bdevs_list": [ 00:23:01.709 { 00:23:01.709 "name": null, 00:23:01.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.709 "is_configured": false, 00:23:01.709 "data_offset": 2048, 00:23:01.709 "data_size": 63488 00:23:01.709 }, 00:23:01.709 { 00:23:01.709 "name": "BaseBdev2", 00:23:01.709 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:01.709 "is_configured": true, 00:23:01.709 "data_offset": 2048, 00:23:01.709 "data_size": 63488 00:23:01.709 } 00:23:01.709 ] 00:23:01.709 }' 00:23:01.710 15:59:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:01.710 15:59:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:01.710 15:59:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:01.970 15:59:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:01.970 15:59:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:01.970 [2024-07-12 15:59:22.357185] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:01.970 [2024-07-12 15:59:22.360478] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xea1720 00:23:01.970 [2024-07-12 15:59:22.361605] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:01.970 15:59:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:02.977 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:23:02.977 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:02.977 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:02.977 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:02.977 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:02.977 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.977 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:03.239 "name": "raid_bdev1", 00:23:03.239 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:03.239 "strip_size_kb": 0, 00:23:03.239 "state": "online", 00:23:03.239 "raid_level": "raid1", 00:23:03.239 "superblock": true, 00:23:03.239 "num_base_bdevs": 2, 00:23:03.239 "num_base_bdevs_discovered": 2, 00:23:03.239 "num_base_bdevs_operational": 2, 00:23:03.239 "process": { 00:23:03.239 "type": "rebuild", 00:23:03.239 "target": "spare", 00:23:03.239 "progress": { 00:23:03.239 "blocks": 22528, 00:23:03.239 "percent": 35 00:23:03.239 } 00:23:03.239 }, 00:23:03.239 "base_bdevs_list": [ 00:23:03.239 { 00:23:03.239 "name": "spare", 00:23:03.239 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:03.239 "is_configured": true, 00:23:03.239 "data_offset": 2048, 00:23:03.239 "data_size": 63488 00:23:03.239 }, 00:23:03.239 { 00:23:03.239 "name": "BaseBdev2", 00:23:03.239 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:03.239 "is_configured": true, 00:23:03.239 "data_offset": 2048, 00:23:03.239 "data_size": 63488 00:23:03.239 } 00:23:03.239 ] 00:23:03.239 }' 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:03.239 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=709 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:03.239 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:03.499 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:03.499 "name": "raid_bdev1", 00:23:03.499 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:03.499 "strip_size_kb": 0, 00:23:03.499 "state": "online", 00:23:03.499 "raid_level": "raid1", 00:23:03.499 "superblock": true, 00:23:03.499 "num_base_bdevs": 2, 00:23:03.499 "num_base_bdevs_discovered": 2, 00:23:03.499 "num_base_bdevs_operational": 2, 00:23:03.499 "process": { 00:23:03.499 "type": "rebuild", 00:23:03.499 "target": "spare", 00:23:03.499 "progress": { 00:23:03.499 "blocks": 28672, 00:23:03.499 "percent": 45 00:23:03.499 } 00:23:03.499 }, 00:23:03.499 "base_bdevs_list": [ 00:23:03.499 { 00:23:03.499 "name": "spare", 00:23:03.499 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:03.499 "is_configured": true, 00:23:03.499 "data_offset": 2048, 00:23:03.499 "data_size": 63488 00:23:03.499 }, 00:23:03.499 { 00:23:03.499 "name": "BaseBdev2", 00:23:03.499 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:03.499 "is_configured": true, 00:23:03.499 "data_offset": 2048, 00:23:03.499 "data_size": 63488 00:23:03.499 } 00:23:03.499 ] 00:23:03.499 }' 00:23:03.499 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:03.499 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:03.499 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:03.499 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:03.499 15:59:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:04.881 15:59:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:04.881 15:59:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:04.881 15:59:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:04.881 15:59:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:04.881 15:59:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:04.881 15:59:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:04.881 15:59:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.881 15:59:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.881 15:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:04.881 "name": "raid_bdev1", 00:23:04.881 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:04.881 "strip_size_kb": 0, 00:23:04.881 "state": 
"online", 00:23:04.881 "raid_level": "raid1", 00:23:04.881 "superblock": true, 00:23:04.881 "num_base_bdevs": 2, 00:23:04.881 "num_base_bdevs_discovered": 2, 00:23:04.881 "num_base_bdevs_operational": 2, 00:23:04.881 "process": { 00:23:04.881 "type": "rebuild", 00:23:04.881 "target": "spare", 00:23:04.881 "progress": { 00:23:04.881 "blocks": 55296, 00:23:04.881 "percent": 87 00:23:04.881 } 00:23:04.881 }, 00:23:04.881 "base_bdevs_list": [ 00:23:04.881 { 00:23:04.881 "name": "spare", 00:23:04.881 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:04.881 "is_configured": true, 00:23:04.881 "data_offset": 2048, 00:23:04.881 "data_size": 63488 00:23:04.881 }, 00:23:04.881 { 00:23:04.881 "name": "BaseBdev2", 00:23:04.881 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:04.881 "is_configured": true, 00:23:04.881 "data_offset": 2048, 00:23:04.881 "data_size": 63488 00:23:04.881 } 00:23:04.881 ] 00:23:04.881 }' 00:23:04.882 15:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:04.882 15:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:04.882 15:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:04.882 15:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:04.882 15:59:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:05.142 [2024-07-12 15:59:25.479565] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:05.142 [2024-07-12 15:59:25.479610] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:05.142 [2024-07-12 15:59:25.479672] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:06.082 "name": "raid_bdev1", 00:23:06.082 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:06.082 "strip_size_kb": 0, 00:23:06.082 "state": "online", 00:23:06.082 "raid_level": "raid1", 00:23:06.082 "superblock": true, 00:23:06.082 "num_base_bdevs": 2, 00:23:06.082 "num_base_bdevs_discovered": 2, 00:23:06.082 "num_base_bdevs_operational": 2, 00:23:06.082 "base_bdevs_list": [ 00:23:06.082 { 00:23:06.082 "name": "spare", 00:23:06.082 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:06.082 "is_configured": true, 00:23:06.082 "data_offset": 2048, 00:23:06.082 "data_size": 63488 
00:23:06.082 }, 00:23:06.082 { 00:23:06.082 "name": "BaseBdev2", 00:23:06.082 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:06.082 "is_configured": true, 00:23:06.082 "data_offset": 2048, 00:23:06.082 "data_size": 63488 00:23:06.082 } 00:23:06.082 ] 00:23:06.082 }' 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.082 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:06.343 "name": "raid_bdev1", 00:23:06.343 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:06.343 "strip_size_kb": 0, 00:23:06.343 "state": "online", 00:23:06.343 "raid_level": "raid1", 00:23:06.343 "superblock": true, 00:23:06.343 "num_base_bdevs": 2, 00:23:06.343 "num_base_bdevs_discovered": 2, 00:23:06.343 "num_base_bdevs_operational": 2, 00:23:06.343 "base_bdevs_list": [ 00:23:06.343 { 00:23:06.343 "name": "spare", 00:23:06.343 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:06.343 "is_configured": true, 00:23:06.343 "data_offset": 2048, 00:23:06.343 "data_size": 63488 00:23:06.343 }, 00:23:06.343 { 00:23:06.343 "name": "BaseBdev2", 00:23:06.343 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:06.343 "is_configured": true, 00:23:06.343 "data_offset": 2048, 00:23:06.343 "data_size": 63488 00:23:06.343 } 00:23:06.343 ] 00:23:06.343 }' 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:06.343 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:06.603 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:06.603 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:06.603 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:06.603 "name": "raid_bdev1", 00:23:06.603 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:06.603 "strip_size_kb": 0, 00:23:06.603 "state": "online", 00:23:06.603 "raid_level": "raid1", 00:23:06.603 "superblock": true, 00:23:06.603 "num_base_bdevs": 2, 00:23:06.603 "num_base_bdevs_discovered": 2, 00:23:06.603 "num_base_bdevs_operational": 2, 00:23:06.603 "base_bdevs_list": [ 00:23:06.603 { 00:23:06.603 "name": "spare", 00:23:06.603 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:06.603 "is_configured": true, 00:23:06.603 "data_offset": 2048, 00:23:06.603 "data_size": 63488 00:23:06.603 }, 00:23:06.603 { 00:23:06.603 "name": "BaseBdev2", 00:23:06.603 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:06.603 "is_configured": true, 00:23:06.603 "data_offset": 2048, 00:23:06.603 "data_size": 63488 00:23:06.603 } 00:23:06.603 ] 00:23:06.603 }' 00:23:06.603 15:59:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:06.603 15:59:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:07.174 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:07.434 [2024-07-12 15:59:27.661259] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:07.434 [2024-07-12 15:59:27.661277] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:07.434 [2024-07-12 15:59:27.661321] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:07.434 [2024-07-12 15:59:27.661360] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:07.434 [2024-07-12 15:59:27.661366] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xea1d30 name raid_bdev1, state offline 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:07.434 15:59:27 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:07.434 15:59:27 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:07.694 /dev/nbd0 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:07.694 1+0 records in 00:23:07.694 1+0 records out 00:23:07.694 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271558 s, 15.1 MB/s 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:07.694 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:07.694 15:59:28 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:07.954 /dev/nbd1 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:07.954 1+0 records in 00:23:07.954 1+0 records out 00:23:07.954 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000323188 s, 12.7 MB/s 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:07.954 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:08.213 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:08.213 15:59:28 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:08.213 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:08.213 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:08.213 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:08.213 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:08.213 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:08.213 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:08.213 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:08.213 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:08.473 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:08.734 15:59:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:08.734 [2024-07-12 15:59:29.146195] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:08.734 [2024-07-12 15:59:29.146228] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.734 [2024-07-12 15:59:29.146241] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe9fd30 00:23:08.734 [2024-07-12 15:59:29.146247] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.734 [2024-07-12 15:59:29.147581] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.734 [2024-07-12 15:59:29.147602] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:08.734 [2024-07-12 15:59:29.147659] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:08.734 [2024-07-12 15:59:29.147680] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:08.734 [2024-07-12 15:59:29.147764] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:08.734 spare 00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.734 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.993 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.993 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.993 [2024-07-12 15:59:29.248055] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1051e10 00:23:08.993 [2024-07-12 15:59:29.248066] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:08.993 [2024-07-12 15:59:29.248231] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xea9a30 00:23:08.993 [2024-07-12 15:59:29.248352] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1051e10 00:23:08.993 [2024-07-12 15:59:29.248358] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1051e10 00:23:08.993 [2024-07-12 15:59:29.248446] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.993 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:08.993 "name": "raid_bdev1", 00:23:08.993 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:08.993 "strip_size_kb": 0, 00:23:08.993 "state": "online", 00:23:08.993 "raid_level": "raid1", 00:23:08.993 "superblock": true, 00:23:08.993 "num_base_bdevs": 2, 00:23:08.993 "num_base_bdevs_discovered": 2, 00:23:08.993 "num_base_bdevs_operational": 2, 00:23:08.993 "base_bdevs_list": [ 00:23:08.993 { 00:23:08.993 "name": "spare", 00:23:08.993 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:08.993 "is_configured": true, 00:23:08.993 "data_offset": 2048, 00:23:08.993 "data_size": 63488 00:23:08.993 }, 00:23:08.993 { 00:23:08.993 "name": "BaseBdev2", 00:23:08.993 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:08.993 "is_configured": true, 00:23:08.993 "data_offset": 2048, 00:23:08.993 "data_size": 63488 00:23:08.994 } 00:23:08.994 ] 00:23:08.994 }' 00:23:08.994 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.994 15:59:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:09.584 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:09.584 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:09.584 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local 
process_type=none 00:23:09.584 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:09.584 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:09.584 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.584 15:59:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.844 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:09.844 "name": "raid_bdev1", 00:23:09.844 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:09.844 "strip_size_kb": 0, 00:23:09.844 "state": "online", 00:23:09.844 "raid_level": "raid1", 00:23:09.844 "superblock": true, 00:23:09.844 "num_base_bdevs": 2, 00:23:09.844 "num_base_bdevs_discovered": 2, 00:23:09.844 "num_base_bdevs_operational": 2, 00:23:09.844 "base_bdevs_list": [ 00:23:09.844 { 00:23:09.844 "name": "spare", 00:23:09.844 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:09.844 "is_configured": true, 00:23:09.844 "data_offset": 2048, 00:23:09.844 "data_size": 63488 00:23:09.844 }, 00:23:09.844 { 00:23:09.844 "name": "BaseBdev2", 00:23:09.844 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:09.844 "is_configured": true, 00:23:09.844 "data_offset": 2048, 00:23:09.844 "data_size": 63488 00:23:09.844 } 00:23:09.844 ] 00:23:09.844 }' 00:23:09.844 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:09.844 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:09.844 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:09.844 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:09.844 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.844 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:10.104 [2024-07-12 15:59:30.521744] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.104 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.364 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:10.364 "name": "raid_bdev1", 00:23:10.364 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:10.364 "strip_size_kb": 0, 00:23:10.364 "state": "online", 00:23:10.364 "raid_level": "raid1", 00:23:10.364 "superblock": true, 00:23:10.364 "num_base_bdevs": 2, 00:23:10.364 "num_base_bdevs_discovered": 1, 00:23:10.364 "num_base_bdevs_operational": 1, 00:23:10.364 "base_bdevs_list": [ 00:23:10.364 { 00:23:10.364 "name": null, 00:23:10.364 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.364 "is_configured": false, 00:23:10.364 "data_offset": 2048, 00:23:10.364 "data_size": 63488 00:23:10.364 }, 00:23:10.364 { 00:23:10.364 "name": "BaseBdev2", 00:23:10.364 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:10.364 "is_configured": true, 00:23:10.364 "data_offset": 2048, 00:23:10.364 "data_size": 63488 00:23:10.364 } 00:23:10.364 ] 00:23:10.364 }' 00:23:10.364 15:59:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:10.364 15:59:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:10.934 15:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:11.194 [2024-07-12 15:59:31.436063] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:11.194 [2024-07-12 15:59:31.436164] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:11.194 [2024-07-12 15:59:31.436173] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:11.194 [2024-07-12 15:59:31.436191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:11.194 [2024-07-12 15:59:31.439487] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xea0da0 00:23:11.194 [2024-07-12 15:59:31.440551] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:11.194 15:59:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:12.133 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:12.133 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:12.133 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:12.133 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:12.133 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:12.133 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.133 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.393 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:12.393 "name": "raid_bdev1", 00:23:12.393 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:12.393 "strip_size_kb": 0, 00:23:12.393 "state": "online", 00:23:12.393 "raid_level": "raid1", 00:23:12.393 "superblock": true, 00:23:12.393 "num_base_bdevs": 2, 00:23:12.393 "num_base_bdevs_discovered": 2, 00:23:12.393 "num_base_bdevs_operational": 2, 00:23:12.393 "process": { 00:23:12.393 "type": "rebuild", 00:23:12.393 "target": "spare", 00:23:12.393 "progress": { 00:23:12.393 "blocks": 22528, 00:23:12.393 "percent": 35 00:23:12.393 } 00:23:12.393 }, 00:23:12.393 "base_bdevs_list": [ 00:23:12.393 { 00:23:12.393 "name": "spare", 00:23:12.393 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:12.393 "is_configured": true, 00:23:12.393 "data_offset": 2048, 00:23:12.393 "data_size": 63488 00:23:12.393 }, 00:23:12.393 { 00:23:12.393 "name": "BaseBdev2", 00:23:12.393 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:12.393 "is_configured": true, 00:23:12.393 "data_offset": 2048, 00:23:12.393 "data_size": 63488 00:23:12.393 } 00:23:12.393 ] 00:23:12.393 }' 00:23:12.393 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:12.393 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:12.393 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:12.393 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:12.393 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:12.652 [2024-07-12 15:59:32.909317] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:12.652 [2024-07-12 15:59:32.949383] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:12.652 [2024-07-12 15:59:32.949413] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:23:12.652 [2024-07-12 15:59:32.949423] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:12.652 [2024-07-12 15:59:32.949427] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.652 15:59:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.912 15:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:12.912 "name": "raid_bdev1", 00:23:12.912 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:12.912 "strip_size_kb": 0, 00:23:12.912 "state": "online", 00:23:12.912 "raid_level": "raid1", 00:23:12.912 "superblock": true, 00:23:12.912 "num_base_bdevs": 2, 00:23:12.912 "num_base_bdevs_discovered": 1, 00:23:12.912 "num_base_bdevs_operational": 1, 00:23:12.912 "base_bdevs_list": [ 00:23:12.912 { 00:23:12.912 "name": null, 00:23:12.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:12.912 "is_configured": false, 00:23:12.912 "data_offset": 2048, 00:23:12.912 "data_size": 63488 00:23:12.912 }, 00:23:12.912 { 00:23:12.912 "name": "BaseBdev2", 00:23:12.912 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:12.912 "is_configured": true, 00:23:12.912 "data_offset": 2048, 00:23:12.912 "data_size": 63488 00:23:12.912 } 00:23:12.912 ] 00:23:12.912 }' 00:23:12.912 15:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:12.912 15:59:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:13.479 15:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:13.479 [2024-07-12 15:59:33.879762] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:13.479 [2024-07-12 15:59:33.879795] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:13.479 [2024-07-12 15:59:33.879808] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xea13c0 00:23:13.479 [2024-07-12 15:59:33.879814] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: 
bdev claimed 00:23:13.479 [2024-07-12 15:59:33.880112] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:13.479 [2024-07-12 15:59:33.880123] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:13.479 [2024-07-12 15:59:33.880180] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:13.479 [2024-07-12 15:59:33.880187] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:13.480 [2024-07-12 15:59:33.880192] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:23:13.480 [2024-07-12 15:59:33.880204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:13.480 [2024-07-12 15:59:33.883545] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb81fc0 00:23:13.480 [2024-07-12 15:59:33.884603] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:13.480 spare 00:23:13.480 15:59:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:14.860 15:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:14.860 15:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:14.860 15:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:14.860 15:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:14.860 15:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:14.860 15:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.860 15:59:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.860 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.860 "name": "raid_bdev1", 00:23:14.860 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:14.860 "strip_size_kb": 0, 00:23:14.860 "state": "online", 00:23:14.860 "raid_level": "raid1", 00:23:14.860 "superblock": true, 00:23:14.860 "num_base_bdevs": 2, 00:23:14.860 "num_base_bdevs_discovered": 2, 00:23:14.860 "num_base_bdevs_operational": 2, 00:23:14.860 "process": { 00:23:14.860 "type": "rebuild", 00:23:14.860 "target": "spare", 00:23:14.860 "progress": { 00:23:14.860 "blocks": 22528, 00:23:14.860 "percent": 35 00:23:14.860 } 00:23:14.860 }, 00:23:14.860 "base_bdevs_list": [ 00:23:14.860 { 00:23:14.860 "name": "spare", 00:23:14.860 "uuid": "a7e3aba5-7818-57d3-b928-74156b6fac93", 00:23:14.860 "is_configured": true, 00:23:14.860 "data_offset": 2048, 00:23:14.860 "data_size": 63488 00:23:14.860 }, 00:23:14.860 { 00:23:14.860 "name": "BaseBdev2", 00:23:14.860 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:14.860 "is_configured": true, 00:23:14.860 "data_offset": 2048, 00:23:14.860 "data_size": 63488 00:23:14.860 } 00:23:14.860 ] 00:23:14.860 }' 00:23:14.860 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.860 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:14.860 15:59:35 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.860 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:14.860 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:15.120 [2024-07-12 15:59:35.369072] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:15.120 [2024-07-12 15:59:35.393391] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:15.120 [2024-07-12 15:59:35.393422] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:15.120 [2024-07-12 15:59:35.393432] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:15.120 [2024-07-12 15:59:35.393436] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.120 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.380 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:15.380 "name": "raid_bdev1", 00:23:15.380 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:15.380 "strip_size_kb": 0, 00:23:15.380 "state": "online", 00:23:15.380 "raid_level": "raid1", 00:23:15.380 "superblock": true, 00:23:15.380 "num_base_bdevs": 2, 00:23:15.380 "num_base_bdevs_discovered": 1, 00:23:15.380 "num_base_bdevs_operational": 1, 00:23:15.380 "base_bdevs_list": [ 00:23:15.380 { 00:23:15.380 "name": null, 00:23:15.380 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.380 "is_configured": false, 00:23:15.380 "data_offset": 2048, 00:23:15.380 "data_size": 63488 00:23:15.380 }, 00:23:15.380 { 00:23:15.380 "name": "BaseBdev2", 00:23:15.380 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:15.380 "is_configured": true, 00:23:15.380 "data_offset": 2048, 00:23:15.380 "data_size": 63488 00:23:15.380 } 00:23:15.380 ] 00:23:15.380 }' 00:23:15.380 15:59:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:15.380 15:59:35 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:15.948 "name": "raid_bdev1", 00:23:15.948 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:15.948 "strip_size_kb": 0, 00:23:15.948 "state": "online", 00:23:15.948 "raid_level": "raid1", 00:23:15.948 "superblock": true, 00:23:15.948 "num_base_bdevs": 2, 00:23:15.948 "num_base_bdevs_discovered": 1, 00:23:15.948 "num_base_bdevs_operational": 1, 00:23:15.948 "base_bdevs_list": [ 00:23:15.948 { 00:23:15.948 "name": null, 00:23:15.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:15.948 "is_configured": false, 00:23:15.948 "data_offset": 2048, 00:23:15.948 "data_size": 63488 00:23:15.948 }, 00:23:15.948 { 00:23:15.948 "name": "BaseBdev2", 00:23:15.948 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:15.948 "is_configured": true, 00:23:15.948 "data_offset": 2048, 00:23:15.948 "data_size": 63488 00:23:15.948 } 00:23:15.948 ] 00:23:15.948 }' 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:15.948 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:16.207 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:16.207 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:16.207 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:16.466 [2024-07-12 15:59:36.796719] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:16.466 [2024-07-12 15:59:36.796749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.466 [2024-07-12 15:59:36.796762] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xea9240 00:23:16.466 [2024-07-12 15:59:36.796768] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.466 [2024-07-12 15:59:36.797039] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.466 [2024-07-12 15:59:36.797050] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:16.466 [2024-07-12 15:59:36.797091] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:16.466 [2024-07-12 15:59:36.797098] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:16.466 [2024-07-12 15:59:36.797103] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:16.466 BaseBdev1 00:23:16.466 15:59:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.401 15:59:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.659 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:17.659 "name": "raid_bdev1", 00:23:17.659 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:17.659 "strip_size_kb": 0, 00:23:17.659 "state": "online", 00:23:17.659 "raid_level": "raid1", 00:23:17.659 "superblock": true, 00:23:17.659 "num_base_bdevs": 2, 00:23:17.659 "num_base_bdevs_discovered": 1, 00:23:17.659 "num_base_bdevs_operational": 1, 00:23:17.659 "base_bdevs_list": [ 00:23:17.659 { 00:23:17.659 "name": null, 00:23:17.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:17.659 "is_configured": false, 00:23:17.659 "data_offset": 2048, 00:23:17.659 "data_size": 63488 00:23:17.659 }, 00:23:17.659 { 00:23:17.659 "name": "BaseBdev2", 00:23:17.659 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:17.659 "is_configured": true, 00:23:17.659 "data_offset": 2048, 00:23:17.659 "data_size": 63488 00:23:17.659 } 00:23:17.659 ] 00:23:17.659 }' 00:23:17.659 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:17.659 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:18.226 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:18.226 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:18.226 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:18.226 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # 
local target=none 00:23:18.226 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:18.226 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.226 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:18.485 "name": "raid_bdev1", 00:23:18.485 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:18.485 "strip_size_kb": 0, 00:23:18.485 "state": "online", 00:23:18.485 "raid_level": "raid1", 00:23:18.485 "superblock": true, 00:23:18.485 "num_base_bdevs": 2, 00:23:18.485 "num_base_bdevs_discovered": 1, 00:23:18.485 "num_base_bdevs_operational": 1, 00:23:18.485 "base_bdevs_list": [ 00:23:18.485 { 00:23:18.485 "name": null, 00:23:18.485 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:18.485 "is_configured": false, 00:23:18.485 "data_offset": 2048, 00:23:18.485 "data_size": 63488 00:23:18.485 }, 00:23:18.485 { 00:23:18.485 "name": "BaseBdev2", 00:23:18.485 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:18.485 "is_configured": true, 00:23:18.485 "data_offset": 2048, 00:23:18.485 "data_size": 63488 00:23:18.485 } 00:23:18.485 ] 00:23:18.485 }' 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:18.485 15:59:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:18.743 [2024-07-12 15:59:39.030388] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:18.743 [2024-07-12 15:59:39.030472] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:18.743 [2024-07-12 15:59:39.030479] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:18.743 request: 00:23:18.743 { 00:23:18.743 "base_bdev": "BaseBdev1", 00:23:18.743 "raid_bdev": "raid_bdev1", 00:23:18.743 "method": "bdev_raid_add_base_bdev", 00:23:18.743 "req_id": 1 00:23:18.743 } 00:23:18.743 Got JSON-RPC error response 00:23:18.743 response: 00:23:18.743 { 00:23:18.743 "code": -22, 00:23:18.743 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:18.743 } 00:23:18.743 15:59:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:23:18.743 15:59:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:18.743 15:59:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:18.743 15:59:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:18.743 15:59:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.681 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:19.940 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:19.940 "name": "raid_bdev1", 00:23:19.940 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:19.940 "strip_size_kb": 0, 00:23:19.941 "state": "online", 00:23:19.941 "raid_level": "raid1", 00:23:19.941 "superblock": true, 00:23:19.941 "num_base_bdevs": 2, 00:23:19.941 "num_base_bdevs_discovered": 1, 00:23:19.941 "num_base_bdevs_operational": 1, 00:23:19.941 
"base_bdevs_list": [ 00:23:19.941 { 00:23:19.941 "name": null, 00:23:19.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:19.941 "is_configured": false, 00:23:19.941 "data_offset": 2048, 00:23:19.941 "data_size": 63488 00:23:19.941 }, 00:23:19.941 { 00:23:19.941 "name": "BaseBdev2", 00:23:19.941 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:19.941 "is_configured": true, 00:23:19.941 "data_offset": 2048, 00:23:19.941 "data_size": 63488 00:23:19.941 } 00:23:19.941 ] 00:23:19.941 }' 00:23:19.941 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:19.941 15:59:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:20.511 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:20.511 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:20.511 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:20.511 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:20.511 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:20.511 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.511 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.771 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.771 "name": "raid_bdev1", 00:23:20.771 "uuid": "f40d4d0e-cd79-4267-896a-b7d98f744669", 00:23:20.771 "strip_size_kb": 0, 00:23:20.771 "state": "online", 00:23:20.771 "raid_level": "raid1", 00:23:20.771 "superblock": true, 00:23:20.771 "num_base_bdevs": 2, 00:23:20.771 "num_base_bdevs_discovered": 1, 00:23:20.771 "num_base_bdevs_operational": 1, 00:23:20.771 "base_bdevs_list": [ 00:23:20.771 { 00:23:20.771 "name": null, 00:23:20.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:20.772 "is_configured": false, 00:23:20.772 "data_offset": 2048, 00:23:20.772 "data_size": 63488 00:23:20.772 }, 00:23:20.772 { 00:23:20.772 "name": "BaseBdev2", 00:23:20.772 "uuid": "11bc7bfa-2916-5da8-af29-a23797e6c380", 00:23:20.772 "is_configured": true, 00:23:20.772 "data_offset": 2048, 00:23:20.772 "data_size": 63488 00:23:20.772 } 00:23:20.772 ] 00:23:20.772 }' 00:23:20.772 15:59:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2622158 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2622158 ']' 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2622158 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:20.772 15:59:41 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2622158 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2622158' 00:23:20.772 killing process with pid 2622158 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2622158 00:23:20.772 Received shutdown signal, test time was about 60.000000 seconds 00:23:20.772 00:23:20.772 Latency(us) 00:23:20.772 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:20.772 =================================================================================================================== 00:23:20.772 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:20.772 [2024-07-12 15:59:41.112993] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:20.772 [2024-07-12 15:59:41.113058] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:20.772 [2024-07-12 15:59:41.113088] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:20.772 [2024-07-12 15:59:41.113094] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1051e10 name raid_bdev1, state offline 00:23:20.772 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2622158 00:23:20.772 [2024-07-12 15:59:41.127954] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:23:21.032 00:23:21.032 real 0m35.223s 00:23:21.032 user 0m53.521s 00:23:21.032 sys 0m4.721s 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:21.032 ************************************ 00:23:21.032 END TEST raid_rebuild_test_sb 00:23:21.032 ************************************ 00:23:21.032 15:59:41 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:21.032 15:59:41 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:23:21.032 15:59:41 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:21.032 15:59:41 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:21.032 15:59:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:21.032 ************************************ 00:23:21.032 START TEST raid_rebuild_test_io 00:23:21.032 ************************************ 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 
00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:21.032 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2628502 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2628502 /var/tmp/spdk-raid.sock 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2628502 ']' 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:21.033 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:21.033 15:59:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:21.033 [2024-07-12 15:59:41.384421] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:23:21.033 [2024-07-12 15:59:41.384472] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2628502 ] 00:23:21.033 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:21.033 Zero copy mechanism will not be used. 00:23:21.033 [2024-07-12 15:59:41.473448] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:21.292 [2024-07-12 15:59:41.541333] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:21.292 [2024-07-12 15:59:41.584180] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:21.292 [2024-07-12 15:59:41.584203] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:22.230 15:59:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:22.231 15:59:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:23:22.231 15:59:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:22.231 15:59:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:22.850 BaseBdev1_malloc 00:23:22.850 15:59:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:23.422 [2024-07-12 15:59:43.624925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:23.422 [2024-07-12 15:59:43.624962] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:23.422 [2024-07-12 15:59:43.624977] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x974010 00:23:23.422 [2024-07-12 15:59:43.624983] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:23.422 [2024-07-12 15:59:43.626285] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:23.422 [2024-07-12 15:59:43.626304] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:23.422 BaseBdev1 00:23:23.422 15:59:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:23.422 15:59:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:23.992 BaseBdev2_malloc 00:23:23.992 15:59:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:24.561 [2024-07-12 15:59:44.709596] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:24.561 [2024-07-12 15:59:44.709628] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:24.561 [2024-07-12 15:59:44.709641] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x974c30 00:23:24.561 [2024-07-12 15:59:44.709648] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:24.561 [2024-07-12 15:59:44.710831] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:24.561 [2024-07-12 15:59:44.710850] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:24.561 BaseBdev2 00:23:24.561 15:59:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:24.820 spare_malloc 00:23:25.080 15:59:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:25.648 spare_delay 00:23:25.648 15:59:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:25.908 [2024-07-12 15:59:46.335522] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:25.908 [2024-07-12 15:59:46.335553] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:25.908 [2024-07-12 15:59:46.335565] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb257f0 00:23:25.908 [2024-07-12 15:59:46.335571] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:25.908 [2024-07-12 15:59:46.336779] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:25.908 [2024-07-12 15:59:46.336803] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:25.908 spare 00:23:26.168 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:26.738 [2024-07-12 15:59:46.876906] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:26.738 [2024-07-12 15:59:46.877921] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:26.738 [2024-07-12 15:59:46.877981] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x96cd30 00:23:26.738 [2024-07-12 15:59:46.877987] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:26.738 [2024-07-12 15:59:46.878138] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb177d0 00:23:26.738 [2024-07-12 15:59:46.878248] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x96cd30 00:23:26.738 [2024-07-12 15:59:46.878254] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x96cd30 00:23:26.738 [2024-07-12 15:59:46.878335] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.738 15:59:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.308 15:59:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.308 "name": "raid_bdev1", 00:23:27.308 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:27.308 "strip_size_kb": 0, 00:23:27.308 "state": "online", 00:23:27.308 "raid_level": "raid1", 00:23:27.308 "superblock": false, 00:23:27.308 "num_base_bdevs": 2, 00:23:27.308 "num_base_bdevs_discovered": 2, 00:23:27.308 "num_base_bdevs_operational": 2, 00:23:27.308 "base_bdevs_list": [ 00:23:27.308 { 00:23:27.308 "name": "BaseBdev1", 00:23:27.308 "uuid": "e7d323cd-912a-5ff1-840c-408e8330a96c", 00:23:27.308 "is_configured": true, 00:23:27.308 "data_offset": 0, 00:23:27.308 "data_size": 65536 00:23:27.308 }, 00:23:27.308 { 00:23:27.308 "name": "BaseBdev2", 00:23:27.308 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:27.308 "is_configured": true, 00:23:27.308 "data_offset": 0, 00:23:27.308 "data_size": 65536 00:23:27.308 } 00:23:27.308 ] 00:23:27.308 }' 00:23:27.308 15:59:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.308 15:59:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:27.568 15:59:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:27.568 15:59:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:27.828 [2024-07-12 15:59:48.136288] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:27.828 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:27.828 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.828 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:28.087 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:28.087 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:28.087 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:28.087 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:28.087 [2024-07-12 15:59:48.434225] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 
0x96c760 00:23:28.087 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:28.087 Zero copy mechanism will not be used. 00:23:28.087 Running I/O for 60 seconds... 00:23:28.087 [2024-07-12 15:59:48.530069] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:28.347 [2024-07-12 15:59:48.536623] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x96c760 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.347 15:59:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.916 15:59:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:28.916 "name": "raid_bdev1", 00:23:28.916 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:28.916 "strip_size_kb": 0, 00:23:28.916 "state": "online", 00:23:28.916 "raid_level": "raid1", 00:23:28.916 "superblock": false, 00:23:28.916 "num_base_bdevs": 2, 00:23:28.916 "num_base_bdevs_discovered": 1, 00:23:28.916 "num_base_bdevs_operational": 1, 00:23:28.916 "base_bdevs_list": [ 00:23:28.916 { 00:23:28.916 "name": null, 00:23:28.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:28.916 "is_configured": false, 00:23:28.916 "data_offset": 0, 00:23:28.916 "data_size": 65536 00:23:28.916 }, 00:23:28.916 { 00:23:28.916 "name": "BaseBdev2", 00:23:28.916 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:28.916 "is_configured": true, 00:23:28.916 "data_offset": 0, 00:23:28.916 "data_size": 65536 00:23:28.916 } 00:23:28.916 ] 00:23:28.916 }' 00:23:28.916 15:59:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:28.916 15:59:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:29.855 15:59:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:29.855 [2024-07-12 15:59:50.222862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:29.855 15:59:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:29.855 [2024-07-12 15:59:50.281483] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x679ce0 
00:23:29.855 [2024-07-12 15:59:50.283238] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:30.114 [2024-07-12 15:59:50.397694] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:30.114 [2024-07-12 15:59:50.397914] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:30.375 [2024-07-12 15:59:50.619522] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:30.375 [2024-07-12 15:59:50.619652] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:30.635 [2024-07-12 15:59:50.957340] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:30.635 [2024-07-12 15:59:50.957615] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:30.894 [2024-07-12 15:59:51.166464] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:30.894 [2024-07-12 15:59:51.166575] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:30.894 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:30.894 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:30.894 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:30.894 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:30.894 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:30.894 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.894 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.154 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:31.154 "name": "raid_bdev1", 00:23:31.154 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:31.154 "strip_size_kb": 0, 00:23:31.154 "state": "online", 00:23:31.154 "raid_level": "raid1", 00:23:31.154 "superblock": false, 00:23:31.154 "num_base_bdevs": 2, 00:23:31.154 "num_base_bdevs_discovered": 2, 00:23:31.154 "num_base_bdevs_operational": 2, 00:23:31.154 "process": { 00:23:31.154 "type": "rebuild", 00:23:31.154 "target": "spare", 00:23:31.154 "progress": { 00:23:31.154 "blocks": 12288, 00:23:31.154 "percent": 18 00:23:31.154 } 00:23:31.154 }, 00:23:31.154 "base_bdevs_list": [ 00:23:31.154 { 00:23:31.154 "name": "spare", 00:23:31.154 "uuid": "82d93666-5b34-5906-98d3-1597e9f9cab7", 00:23:31.154 "is_configured": true, 00:23:31.154 "data_offset": 0, 00:23:31.154 "data_size": 65536 00:23:31.154 }, 00:23:31.154 { 00:23:31.154 "name": "BaseBdev2", 00:23:31.154 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:31.154 "is_configured": true, 00:23:31.154 "data_offset": 0, 00:23:31.154 "data_size": 65536 00:23:31.154 } 00:23:31.154 ] 00:23:31.154 }' 00:23:31.154 15:59:51 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:31.154 [2024-07-12 15:59:51.484953] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:31.154 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:31.154 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:31.154 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:31.154 15:59:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:31.414 [2024-07-12 15:59:51.706855] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:31.673 [2024-07-12 15:59:52.016833] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:31.673 [2024-07-12 15:59:52.108263] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:31.933 [2024-07-12 15:59:52.132010] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:31.933 [2024-07-12 15:59:52.239315] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:31.933 [2024-07-12 15:59:52.240498] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:31.933 [2024-07-12 15:59:52.240516] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:31.933 [2024-07-12 15:59:52.240521] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:31.933 [2024-07-12 15:59:52.257235] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x96c760 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.933 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.501 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:23:32.501 "name": "raid_bdev1", 00:23:32.501 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:32.501 "strip_size_kb": 0, 00:23:32.501 "state": "online", 00:23:32.501 "raid_level": "raid1", 00:23:32.501 "superblock": false, 00:23:32.501 "num_base_bdevs": 2, 00:23:32.501 "num_base_bdevs_discovered": 1, 00:23:32.501 "num_base_bdevs_operational": 1, 00:23:32.501 "base_bdevs_list": [ 00:23:32.501 { 00:23:32.501 "name": null, 00:23:32.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:32.501 "is_configured": false, 00:23:32.501 "data_offset": 0, 00:23:32.501 "data_size": 65536 00:23:32.501 }, 00:23:32.501 { 00:23:32.501 "name": "BaseBdev2", 00:23:32.502 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:32.502 "is_configured": true, 00:23:32.502 "data_offset": 0, 00:23:32.502 "data_size": 65536 00:23:32.502 } 00:23:32.502 ] 00:23:32.502 }' 00:23:32.502 15:59:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:32.502 15:59:52 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:33.070 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:33.070 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:33.070 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:33.070 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:33.070 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:33.070 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.070 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.330 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.330 "name": "raid_bdev1", 00:23:33.330 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:33.330 "strip_size_kb": 0, 00:23:33.330 "state": "online", 00:23:33.330 "raid_level": "raid1", 00:23:33.330 "superblock": false, 00:23:33.330 "num_base_bdevs": 2, 00:23:33.330 "num_base_bdevs_discovered": 1, 00:23:33.330 "num_base_bdevs_operational": 1, 00:23:33.330 "base_bdevs_list": [ 00:23:33.330 { 00:23:33.330 "name": null, 00:23:33.330 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:33.330 "is_configured": false, 00:23:33.330 "data_offset": 0, 00:23:33.330 "data_size": 65536 00:23:33.330 }, 00:23:33.330 { 00:23:33.330 "name": "BaseBdev2", 00:23:33.330 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:33.330 "is_configured": true, 00:23:33.330 "data_offset": 0, 00:23:33.330 "data_size": 65536 00:23:33.330 } 00:23:33.330 ] 00:23:33.330 }' 00:23:33.330 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:33.330 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:33.330 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.330 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:33.330 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev 
raid_bdev1 spare 00:23:33.589 [2024-07-12 15:59:53.879322] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:33.589 15:59:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:33.589 [2024-07-12 15:59:53.925029] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x973720 00:23:33.589 [2024-07-12 15:59:53.926156] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:33.849 [2024-07-12 15:59:54.047403] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:33.849 [2024-07-12 15:59:54.047601] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:33.849 [2024-07-12 15:59:54.269310] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:33.849 [2024-07-12 15:59:54.269414] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:34.419 [2024-07-12 15:59:54.722614] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:34.419 [2024-07-12 15:59:54.722786] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:34.679 15:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:34.679 15:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:34.679 15:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:34.679 15:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:34.679 15:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:34.679 15:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.679 15:59:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.679 [2024-07-12 15:59:55.088538] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:34.679 [2024-07-12 15:59:55.088803] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:34.679 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:34.679 "name": "raid_bdev1", 00:23:34.679 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:34.679 "strip_size_kb": 0, 00:23:34.679 "state": "online", 00:23:34.679 "raid_level": "raid1", 00:23:34.679 "superblock": false, 00:23:34.679 "num_base_bdevs": 2, 00:23:34.679 "num_base_bdevs_discovered": 2, 00:23:34.679 "num_base_bdevs_operational": 2, 00:23:34.679 "process": { 00:23:34.679 "type": "rebuild", 00:23:34.679 "target": "spare", 00:23:34.679 "progress": { 00:23:34.679 "blocks": 14336, 00:23:34.679 "percent": 21 00:23:34.679 } 00:23:34.679 }, 00:23:34.679 "base_bdevs_list": [ 00:23:34.679 { 00:23:34.679 "name": "spare", 00:23:34.679 "uuid": "82d93666-5b34-5906-98d3-1597e9f9cab7", 00:23:34.679 "is_configured": true, 00:23:34.679 "data_offset": 0, 00:23:34.679 
"data_size": 65536 00:23:34.679 }, 00:23:34.679 { 00:23:34.679 "name": "BaseBdev2", 00:23:34.679 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:34.679 "is_configured": true, 00:23:34.679 "data_offset": 0, 00:23:34.679 "data_size": 65536 00:23:34.679 } 00:23:34.679 ] 00:23:34.679 }' 00:23:34.679 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=741 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.939 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.939 [2024-07-12 15:59:55.317811] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:34.939 [2024-07-12 15:59:55.317963] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:35.198 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:35.198 "name": "raid_bdev1", 00:23:35.198 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:35.198 "strip_size_kb": 0, 00:23:35.198 "state": "online", 00:23:35.198 "raid_level": "raid1", 00:23:35.198 "superblock": false, 00:23:35.198 "num_base_bdevs": 2, 00:23:35.198 "num_base_bdevs_discovered": 2, 00:23:35.198 "num_base_bdevs_operational": 2, 00:23:35.198 "process": { 00:23:35.198 "type": "rebuild", 00:23:35.198 "target": "spare", 00:23:35.198 "progress": { 00:23:35.198 "blocks": 16384, 00:23:35.198 "percent": 25 00:23:35.198 } 00:23:35.198 }, 00:23:35.198 "base_bdevs_list": [ 00:23:35.198 { 00:23:35.198 "name": "spare", 00:23:35.198 "uuid": "82d93666-5b34-5906-98d3-1597e9f9cab7", 00:23:35.198 "is_configured": true, 00:23:35.198 "data_offset": 0, 00:23:35.198 "data_size": 65536 00:23:35.198 }, 00:23:35.198 { 00:23:35.198 "name": 
"BaseBdev2", 00:23:35.198 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:35.198 "is_configured": true, 00:23:35.198 "data_offset": 0, 00:23:35.198 "data_size": 65536 00:23:35.198 } 00:23:35.198 ] 00:23:35.198 }' 00:23:35.198 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.198 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:35.198 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.198 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:35.198 15:59:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:35.458 [2024-07-12 15:59:55.662261] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:35.458 [2024-07-12 15:59:55.885278] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:36.027 [2024-07-12 15:59:56.215629] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.287 "name": "raid_bdev1", 00:23:36.287 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:36.287 "strip_size_kb": 0, 00:23:36.287 "state": "online", 00:23:36.287 "raid_level": "raid1", 00:23:36.287 "superblock": false, 00:23:36.287 "num_base_bdevs": 2, 00:23:36.287 "num_base_bdevs_discovered": 2, 00:23:36.287 "num_base_bdevs_operational": 2, 00:23:36.287 "process": { 00:23:36.287 "type": "rebuild", 00:23:36.287 "target": "spare", 00:23:36.287 "progress": { 00:23:36.287 "blocks": 30720, 00:23:36.287 "percent": 46 00:23:36.287 } 00:23:36.287 }, 00:23:36.287 "base_bdevs_list": [ 00:23:36.287 { 00:23:36.287 "name": "spare", 00:23:36.287 "uuid": "82d93666-5b34-5906-98d3-1597e9f9cab7", 00:23:36.287 "is_configured": true, 00:23:36.287 "data_offset": 0, 00:23:36.287 "data_size": 65536 00:23:36.287 }, 00:23:36.287 { 00:23:36.287 "name": "BaseBdev2", 00:23:36.287 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:36.287 "is_configured": true, 00:23:36.287 "data_offset": 0, 00:23:36.287 "data_size": 65536 00:23:36.287 } 00:23:36.287 ] 00:23:36.287 }' 00:23:36.287 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.287 
[2024-07-12 15:59:56.716271] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:23:36.547 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:36.547 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.547 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:36.547 15:59:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:37.116 [2024-07-12 15:59:57.261316] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:37.375 [2024-07-12 15:59:57.590828] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:23:37.375 [2024-07-12 15:59:57.591075] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:23:37.375 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:37.375 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:37.375 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:37.375 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:37.375 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:37.375 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:37.375 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.375 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:37.375 [2024-07-12 15:59:57.814393] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:23:37.634 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:37.634 "name": "raid_bdev1", 00:23:37.634 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:37.634 "strip_size_kb": 0, 00:23:37.634 "state": "online", 00:23:37.634 "raid_level": "raid1", 00:23:37.634 "superblock": false, 00:23:37.634 "num_base_bdevs": 2, 00:23:37.634 "num_base_bdevs_discovered": 2, 00:23:37.634 "num_base_bdevs_operational": 2, 00:23:37.634 "process": { 00:23:37.634 "type": "rebuild", 00:23:37.634 "target": "spare", 00:23:37.634 "progress": { 00:23:37.634 "blocks": 47104, 00:23:37.634 "percent": 71 00:23:37.634 } 00:23:37.634 }, 00:23:37.634 "base_bdevs_list": [ 00:23:37.634 { 00:23:37.634 "name": "spare", 00:23:37.634 "uuid": "82d93666-5b34-5906-98d3-1597e9f9cab7", 00:23:37.634 "is_configured": true, 00:23:37.634 "data_offset": 0, 00:23:37.634 "data_size": 65536 00:23:37.634 }, 00:23:37.634 { 00:23:37.634 "name": "BaseBdev2", 00:23:37.634 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:37.634 "is_configured": true, 00:23:37.634 "data_offset": 0, 00:23:37.634 "data_size": 65536 00:23:37.634 } 00:23:37.634 ] 00:23:37.634 }' 00:23:37.634 15:59:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:37.634 15:59:58 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:37.634 15:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:37.634 15:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:37.634 15:59:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:37.894 [2024-07-12 15:59:58.157015] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:38.832 [2024-07-12 15:59:59.027069] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:38.832 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:38.832 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:38.832 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:38.832 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:38.832 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:38.832 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:38.832 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:38.832 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:38.832 [2024-07-12 15:59:59.133791] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:38.832 [2024-07-12 15:59:59.134826] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.092 "name": "raid_bdev1", 00:23:39.092 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:39.092 "strip_size_kb": 0, 00:23:39.092 "state": "online", 00:23:39.092 "raid_level": "raid1", 00:23:39.092 "superblock": false, 00:23:39.092 "num_base_bdevs": 2, 00:23:39.092 "num_base_bdevs_discovered": 2, 00:23:39.092 "num_base_bdevs_operational": 2, 00:23:39.092 "base_bdevs_list": [ 00:23:39.092 { 00:23:39.092 "name": "spare", 00:23:39.092 "uuid": "82d93666-5b34-5906-98d3-1597e9f9cab7", 00:23:39.092 "is_configured": true, 00:23:39.092 "data_offset": 0, 00:23:39.092 "data_size": 65536 00:23:39.092 }, 00:23:39.092 { 00:23:39.092 "name": "BaseBdev2", 00:23:39.092 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:39.092 "is_configured": true, 00:23:39.092 "data_offset": 0, 00:23:39.092 "data_size": 65536 00:23:39.092 } 00:23:39.092 ] 00:23:39.092 }' 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.092 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.352 "name": "raid_bdev1", 00:23:39.352 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:39.352 "strip_size_kb": 0, 00:23:39.352 "state": "online", 00:23:39.352 "raid_level": "raid1", 00:23:39.352 "superblock": false, 00:23:39.352 "num_base_bdevs": 2, 00:23:39.352 "num_base_bdevs_discovered": 2, 00:23:39.352 "num_base_bdevs_operational": 2, 00:23:39.352 "base_bdevs_list": [ 00:23:39.352 { 00:23:39.352 "name": "spare", 00:23:39.352 "uuid": "82d93666-5b34-5906-98d3-1597e9f9cab7", 00:23:39.352 "is_configured": true, 00:23:39.352 "data_offset": 0, 00:23:39.352 "data_size": 65536 00:23:39.352 }, 00:23:39.352 { 00:23:39.352 "name": "BaseBdev2", 00:23:39.352 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:39.352 "is_configured": true, 00:23:39.352 "data_offset": 0, 00:23:39.352 "data_size": 65536 00:23:39.352 } 00:23:39.352 ] 00:23:39.352 }' 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.352 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.352 15:59:59 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.611 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.611 "name": "raid_bdev1", 00:23:39.611 "uuid": "e66af135-808d-4fb5-943e-0d8441c75c5d", 00:23:39.611 "strip_size_kb": 0, 00:23:39.611 "state": "online", 00:23:39.611 "raid_level": "raid1", 00:23:39.611 "superblock": false, 00:23:39.611 "num_base_bdevs": 2, 00:23:39.611 "num_base_bdevs_discovered": 2, 00:23:39.611 "num_base_bdevs_operational": 2, 00:23:39.611 "base_bdevs_list": [ 00:23:39.611 { 00:23:39.611 "name": "spare", 00:23:39.611 "uuid": "82d93666-5b34-5906-98d3-1597e9f9cab7", 00:23:39.611 "is_configured": true, 00:23:39.611 "data_offset": 0, 00:23:39.611 "data_size": 65536 00:23:39.611 }, 00:23:39.611 { 00:23:39.611 "name": "BaseBdev2", 00:23:39.611 "uuid": "7021f6d1-e5ea-577d-91ca-c591d5b7e39c", 00:23:39.611 "is_configured": true, 00:23:39.611 "data_offset": 0, 00:23:39.611 "data_size": 65536 00:23:39.611 } 00:23:39.611 ] 00:23:39.611 }' 00:23:39.611 15:59:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.612 15:59:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:40.181 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:40.181 [2024-07-12 16:00:00.551049] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:40.181 [2024-07-12 16:00:00.551071] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:40.441 00:23:40.441 Latency(us) 00:23:40.441 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:40.441 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:40.441 raid_bdev1 : 12.18 129.86 389.57 0.00 0.00 10210.26 242.61 114536.76 00:23:40.441 =================================================================================================================== 00:23:40.441 Total : 129.86 389.57 0.00 0.00 10210.26 242.61 114536.76 00:23:40.441 [2024-07-12 16:00:00.650504] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:40.441 [2024-07-12 16:00:00.650527] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:40.441 [2024-07-12 16:00:00.650579] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:40.441 [2024-07-12 16:00:00.650586] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x96cd30 name raid_bdev1, state offline 00:23:40.441 0 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:40.441 16:00:00 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:40.441 16:00:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:40.701 /dev/nbd0 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:40.701 1+0 records in 00:23:40.701 1+0 records out 00:23:40.701 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284887 s, 14.4 MB/s 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:40.701 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:23:40.961 /dev/nbd1 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:40.961 1+0 records in 00:23:40.961 1+0 records out 00:23:40.961 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277948 s, 14.7 MB/s 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 
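(Editorial aside, illustrative only: the data-integrity check traced here exports the rebuilt "spare" bdev and the surviving BaseBdev2 over NBD and byte-compares them; $SPDK_DIR is a placeholder for the SPDK checkout used in this run.)
    # expose both bdevs as kernel block devices via the NBD driver
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1
    # a completed raid1 rebuild must leave both mirrors identical; this test has no superblock, so compare from offset 0
    cmp -i 0 /dev/nbd0 /dev/nbd1
    # tear the NBD mappings down again
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0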
00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:40.961 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:41.531 16:00:01 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2628502 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2628502 ']' 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@952 -- # kill -0 2628502 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2628502 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2628502' 00:23:41.789 killing process with pid 2628502 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2628502 00:23:41.789 Received shutdown signal, test time was about 13.746826 seconds 00:23:41.789 00:23:41.789 Latency(us) 00:23:41.789 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:41.789 =================================================================================================================== 00:23:41.789 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:41.789 [2024-07-12 16:00:02.214990] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:41.789 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2628502 00:23:41.789 [2024-07-12 16:00:02.226489] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:42.048 16:00:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:42.048 00:23:42.048 real 0m21.026s 00:23:42.048 user 0m33.968s 00:23:42.048 sys 0m2.414s 00:23:42.048 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:42.048 16:00:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:42.048 ************************************ 00:23:42.048 END TEST raid_rebuild_test_io 00:23:42.048 ************************************ 00:23:42.048 16:00:02 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:42.048 16:00:02 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:23:42.048 16:00:02 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:23:42.048 16:00:02 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:42.048 16:00:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:42.048 ************************************ 00:23:42.048 START TEST raid_rebuild_test_sb_io 00:23:42.048 ************************************ 00:23:42.048 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:23:42.048 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:42.048 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:23:42.048 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:42.049 16:00:02 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2632412 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2632412 /var/tmp/spdk-raid.sock 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2632412 ']' 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:42.049 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:42.049 16:00:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:42.049 [2024-07-12 16:00:02.490804] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
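(Editorial aside, not part of the captured output: the raid_rebuild_test_sb_io run is driven by the bdevperf instance launched with the flags shown just above; a minimal stand-alone reproduction would look like the lines below, with $SPDK_DIR standing in for the SPDK checkout used here.)
    # bdevperf in wait-for-tests mode (-z), serving RPCs on the raid test socket;
    # the workload itself is 50/50 randrw, 3 MiB IOs, queue depth 2, for 60 s
    $SPDK_DIR/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid &
    # ...configure the bdevs via rpc.py, then kick off the background IO:
    $SPDK_DIR/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests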
00:23:42.049 [2024-07-12 16:00:02.490854] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2632412 ] 00:23:42.049 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:42.049 Zero copy mechanism will not be used. 00:23:42.329 [2024-07-12 16:00:02.580216] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:42.329 [2024-07-12 16:00:02.657198] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:42.329 [2024-07-12 16:00:02.705147] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:42.329 [2024-07-12 16:00:02.705175] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:42.907 16:00:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:42.907 16:00:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:23:42.907 16:00:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:42.908 16:00:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:43.478 BaseBdev1_malloc 00:23:43.479 16:00:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:43.738 [2024-07-12 16:00:04.077889] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:43.738 [2024-07-12 16:00:04.077923] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:43.738 [2024-07-12 16:00:04.077939] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2521010 00:23:43.738 [2024-07-12 16:00:04.077946] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:43.738 [2024-07-12 16:00:04.079272] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:43.738 [2024-07-12 16:00:04.079292] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:43.738 BaseBdev1 00:23:43.738 16:00:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:43.738 16:00:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:43.997 BaseBdev2_malloc 00:23:43.997 16:00:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:44.568 [2024-07-12 16:00:04.797686] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:44.568 [2024-07-12 16:00:04.797720] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:44.568 [2024-07-12 16:00:04.797735] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2521c30 00:23:44.568 [2024-07-12 16:00:04.797741] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:44.569 [2024-07-12 
16:00:04.798940] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:44.569 [2024-07-12 16:00:04.798959] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:44.569 BaseBdev2 00:23:44.569 16:00:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:44.569 spare_malloc 00:23:44.829 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:45.399 spare_delay 00:23:45.399 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:45.399 [2024-07-12 16:00:05.741988] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:45.399 [2024-07-12 16:00:05.742017] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:45.399 [2024-07-12 16:00:05.742029] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26d27f0 00:23:45.399 [2024-07-12 16:00:05.742035] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:45.399 [2024-07-12 16:00:05.743239] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:45.399 [2024-07-12 16:00:05.743258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:45.399 spare 00:23:45.399 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:45.660 [2024-07-12 16:00:05.934493] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:45.660 [2024-07-12 16:00:05.935497] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:45.660 [2024-07-12 16:00:05.935615] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2519d30 00:23:45.660 [2024-07-12 16:00:05.935623] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:45.660 [2024-07-12 16:00:05.935778] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25199c0 00:23:45.660 [2024-07-12 16:00:05.935888] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2519d30 00:23:45.660 [2024-07-12 16:00:05.935893] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2519d30 00:23:45.660 [2024-07-12 16:00:05.935964] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.660 16:00:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.920 16:00:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:45.920 "name": "raid_bdev1", 00:23:45.920 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:45.920 "strip_size_kb": 0, 00:23:45.920 "state": "online", 00:23:45.920 "raid_level": "raid1", 00:23:45.920 "superblock": true, 00:23:45.920 "num_base_bdevs": 2, 00:23:45.920 "num_base_bdevs_discovered": 2, 00:23:45.920 "num_base_bdevs_operational": 2, 00:23:45.920 "base_bdevs_list": [ 00:23:45.920 { 00:23:45.920 "name": "BaseBdev1", 00:23:45.920 "uuid": "07a04edb-edf9-5aea-a9ec-09e1873f7d2e", 00:23:45.920 "is_configured": true, 00:23:45.920 "data_offset": 2048, 00:23:45.920 "data_size": 63488 00:23:45.920 }, 00:23:45.920 { 00:23:45.920 "name": "BaseBdev2", 00:23:45.920 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:45.920 "is_configured": true, 00:23:45.920 "data_offset": 2048, 00:23:45.920 "data_size": 63488 00:23:45.920 } 00:23:45.920 ] 00:23:45.920 }' 00:23:45.920 16:00:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:45.920 16:00:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:46.489 16:00:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:46.489 16:00:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:46.489 [2024-07-12 16:00:06.881065] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:46.489 16:00:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:46.489 16:00:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.489 16:00:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:46.749 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:46.749 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:23:46.749 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:46.749 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s 
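(Editorial aside, reconstructed from the RPCs traced above rather than quoted from the log: this is roughly how the raid_rebuild_test_sb_io fixture is assembled; $SPDK_DIR is a placeholder for the SPDK checkout used in this run.)
    # two 32 MiB / 512 B-block malloc bdevs, each wrapped in a passthru bdev, form the raid1 members
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
    # the future rebuild target "spare" is layered on a delay bdev, so writes to it are artificially slowed
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
    # -s adds the on-disk superblock, which is why data_offset is reported as 2048 blocks in this trace
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq -r '.[].base_bdevs_list[0].data_offset'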
/var/tmp/spdk-raid.sock perform_tests 00:23:46.749 [2024-07-12 16:00:07.187054] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25186c0 00:23:46.749 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:46.749 Zero copy mechanism will not be used. 00:23:46.749 Running I/O for 60 seconds... 00:23:47.008 [2024-07-12 16:00:07.280484] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:47.008 [2024-07-12 16:00:07.280654] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x25186c0 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:47.008 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:47.009 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.009 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.268 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:47.268 "name": "raid_bdev1", 00:23:47.268 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:47.268 "strip_size_kb": 0, 00:23:47.268 "state": "online", 00:23:47.268 "raid_level": "raid1", 00:23:47.268 "superblock": true, 00:23:47.268 "num_base_bdevs": 2, 00:23:47.268 "num_base_bdevs_discovered": 1, 00:23:47.268 "num_base_bdevs_operational": 1, 00:23:47.268 "base_bdevs_list": [ 00:23:47.268 { 00:23:47.268 "name": null, 00:23:47.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:47.268 "is_configured": false, 00:23:47.268 "data_offset": 2048, 00:23:47.268 "data_size": 63488 00:23:47.268 }, 00:23:47.268 { 00:23:47.268 "name": "BaseBdev2", 00:23:47.268 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:47.268 "is_configured": true, 00:23:47.268 "data_offset": 2048, 00:23:47.268 "data_size": 63488 00:23:47.268 } 00:23:47.268 ] 00:23:47.268 }' 00:23:47.268 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:47.268 16:00:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:47.845 16:00:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:47.846 [2024-07-12 16:00:08.227905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:47.846 
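(Editorial aside, illustrative only: the degrade-and-rebuild step traced above, expressed as standalone RPC calls; the jq filter on .process.progress is an assumption based on the JSON shape shown in this log, not a call made by the test script.)
    # drop BaseBdev1 so raid_bdev1 runs degraded, then attach "spare" as the rebuild target
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
    # while the rebuild runs, .process reports type "rebuild", target "spare" and a blocks/percent progress counter
    $SPDK_DIR/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
        | jq -r '.[] | select(.name == "raid_bdev1") | .process.progress.percent'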
16:00:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:47.846 [2024-07-12 16:00:08.260992] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2226ce0 00:23:47.846 [2024-07-12 16:00:08.262605] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:48.109 [2024-07-12 16:00:08.367951] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:48.109 [2024-07-12 16:00:08.368210] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:48.376 [2024-07-12 16:00:08.576687] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:48.376 [2024-07-12 16:00:08.576800] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:48.643 [2024-07-12 16:00:08.913061] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:48.644 [2024-07-12 16:00:08.913279] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:48.915 [2024-07-12 16:00:09.121932] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:48.915 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:48.915 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:48.915 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:48.915 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:48.915 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:48.915 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.915 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:49.181 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:49.182 "name": "raid_bdev1", 00:23:49.182 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:49.182 "strip_size_kb": 0, 00:23:49.182 "state": "online", 00:23:49.182 "raid_level": "raid1", 00:23:49.182 "superblock": true, 00:23:49.182 "num_base_bdevs": 2, 00:23:49.182 "num_base_bdevs_discovered": 2, 00:23:49.182 "num_base_bdevs_operational": 2, 00:23:49.182 "process": { 00:23:49.182 "type": "rebuild", 00:23:49.182 "target": "spare", 00:23:49.182 "progress": { 00:23:49.182 "blocks": 12288, 00:23:49.182 "percent": 19 00:23:49.182 } 00:23:49.182 }, 00:23:49.182 "base_bdevs_list": [ 00:23:49.182 { 00:23:49.182 "name": "spare", 00:23:49.182 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:23:49.182 "is_configured": true, 00:23:49.182 "data_offset": 2048, 00:23:49.182 "data_size": 63488 00:23:49.182 }, 00:23:49.182 { 00:23:49.182 "name": "BaseBdev2", 00:23:49.182 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:49.182 "is_configured": true, 00:23:49.182 "data_offset": 2048, 00:23:49.182 "data_size": 63488 00:23:49.182 } 00:23:49.182 ] 00:23:49.182 }' 
00:23:49.182 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:49.182 [2024-07-12 16:00:09.466153] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:49.182 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:49.182 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:49.182 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:49.182 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:49.442 [2024-07-12 16:00:09.688196] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:49.442 [2024-07-12 16:00:09.688344] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:49.442 [2024-07-12 16:00:09.723322] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:49.442 [2024-07-12 16:00:09.803486] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:49.704 [2024-07-12 16:00:09.910233] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:49.704 [2024-07-12 16:00:09.924574] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:49.704 [2024-07-12 16:00:09.924591] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:49.704 [2024-07-12 16:00:09.924597] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:49.704 [2024-07-12 16:00:09.954413] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x25186c0 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.704 16:00:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:23:49.972 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:49.972 "name": "raid_bdev1", 00:23:49.972 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:49.972 "strip_size_kb": 0, 00:23:49.972 "state": "online", 00:23:49.972 "raid_level": "raid1", 00:23:49.972 "superblock": true, 00:23:49.972 "num_base_bdevs": 2, 00:23:49.972 "num_base_bdevs_discovered": 1, 00:23:49.972 "num_base_bdevs_operational": 1, 00:23:49.972 "base_bdevs_list": [ 00:23:49.972 { 00:23:49.972 "name": null, 00:23:49.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:49.972 "is_configured": false, 00:23:49.972 "data_offset": 2048, 00:23:49.972 "data_size": 63488 00:23:49.972 }, 00:23:49.972 { 00:23:49.972 "name": "BaseBdev2", 00:23:49.972 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:49.972 "is_configured": true, 00:23:49.972 "data_offset": 2048, 00:23:49.972 "data_size": 63488 00:23:49.972 } 00:23:49.972 ] 00:23:49.972 }' 00:23:49.972 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:49.972 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:50.559 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:50.559 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:50.559 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:50.559 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:50.559 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:50.559 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.559 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.559 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.559 "name": "raid_bdev1", 00:23:50.559 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:50.559 "strip_size_kb": 0, 00:23:50.559 "state": "online", 00:23:50.559 "raid_level": "raid1", 00:23:50.559 "superblock": true, 00:23:50.559 "num_base_bdevs": 2, 00:23:50.559 "num_base_bdevs_discovered": 1, 00:23:50.559 "num_base_bdevs_operational": 1, 00:23:50.559 "base_bdevs_list": [ 00:23:50.559 { 00:23:50.559 "name": null, 00:23:50.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.559 "is_configured": false, 00:23:50.559 "data_offset": 2048, 00:23:50.559 "data_size": 63488 00:23:50.559 }, 00:23:50.559 { 00:23:50.559 "name": "BaseBdev2", 00:23:50.559 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:50.559 "is_configured": true, 00:23:50.559 "data_offset": 2048, 00:23:50.559 "data_size": 63488 00:23:50.559 } 00:23:50.559 ] 00:23:50.559 }' 00:23:50.559 16:00:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.821 16:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:50.821 16:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:50.822 16:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:50.822 16:00:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:50.822 [2024-07-12 16:00:11.239275] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:51.089 16:00:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:51.090 [2024-07-12 16:00:11.311219] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26c8bd0 00:23:51.090 [2024-07-12 16:00:11.312356] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:51.090 [2024-07-12 16:00:11.414193] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:51.090 [2024-07-12 16:00:11.414394] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:51.353 [2024-07-12 16:00:11.660697] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:51.617 [2024-07-12 16:00:12.011340] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:51.888 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:51.888 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:51.888 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:51.888 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:51.888 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:51.888 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:51.888 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.148 [2024-07-12 16:00:12.358156] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:52.148 [2024-07-12 16:00:12.358375] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:52.148 [2024-07-12 16:00:12.474789] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.148 "name": "raid_bdev1", 00:23:52.148 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:52.148 "strip_size_kb": 0, 00:23:52.148 "state": "online", 00:23:52.148 "raid_level": "raid1", 00:23:52.148 "superblock": true, 00:23:52.148 "num_base_bdevs": 2, 00:23:52.148 "num_base_bdevs_discovered": 2, 00:23:52.148 "num_base_bdevs_operational": 2, 00:23:52.148 "process": { 00:23:52.148 "type": "rebuild", 00:23:52.148 "target": "spare", 00:23:52.148 "progress": { 00:23:52.148 "blocks": 14336, 00:23:52.148 "percent": 22 00:23:52.148 } 00:23:52.148 }, 00:23:52.148 "base_bdevs_list": [ 00:23:52.148 { 00:23:52.148 "name": "spare", 00:23:52.148 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:23:52.148 "is_configured": true, 
00:23:52.148 "data_offset": 2048, 00:23:52.148 "data_size": 63488 00:23:52.148 }, 00:23:52.148 { 00:23:52.148 "name": "BaseBdev2", 00:23:52.148 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:52.148 "is_configured": true, 00:23:52.148 "data_offset": 2048, 00:23:52.148 "data_size": 63488 00:23:52.148 } 00:23:52.148 ] 00:23:52.148 }' 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:52.148 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=758 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.148 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.414 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:52.414 "name": "raid_bdev1", 00:23:52.414 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:52.414 "strip_size_kb": 0, 00:23:52.414 "state": "online", 00:23:52.414 "raid_level": "raid1", 00:23:52.414 "superblock": true, 00:23:52.414 "num_base_bdevs": 2, 00:23:52.414 "num_base_bdevs_discovered": 2, 00:23:52.414 "num_base_bdevs_operational": 2, 00:23:52.414 "process": { 00:23:52.414 "type": "rebuild", 00:23:52.414 "target": "spare", 00:23:52.414 "progress": { 00:23:52.414 "blocks": 18432, 00:23:52.414 "percent": 29 00:23:52.414 } 00:23:52.414 }, 00:23:52.414 "base_bdevs_list": [ 00:23:52.414 { 00:23:52.414 "name": "spare", 00:23:52.414 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:23:52.414 "is_configured": true, 00:23:52.414 "data_offset": 2048, 00:23:52.414 "data_size": 63488 00:23:52.414 }, 00:23:52.414 { 00:23:52.414 "name": 
"BaseBdev2", 00:23:52.414 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:52.414 "is_configured": true, 00:23:52.414 "data_offset": 2048, 00:23:52.414 "data_size": 63488 00:23:52.414 } 00:23:52.414 ] 00:23:52.414 }' 00:23:52.414 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:52.414 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:52.414 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:52.414 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:52.414 16:00:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:52.685 [2024-07-12 16:00:12.921221] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:53.653 16:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:53.653 16:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:53.653 16:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:53.653 16:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:53.653 16:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:53.653 16:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:53.653 16:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.653 16:00:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.653 16:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:53.653 "name": "raid_bdev1", 00:23:53.653 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:53.653 "strip_size_kb": 0, 00:23:53.653 "state": "online", 00:23:53.653 "raid_level": "raid1", 00:23:53.653 "superblock": true, 00:23:53.653 "num_base_bdevs": 2, 00:23:53.653 "num_base_bdevs_discovered": 2, 00:23:53.653 "num_base_bdevs_operational": 2, 00:23:53.653 "process": { 00:23:53.653 "type": "rebuild", 00:23:53.653 "target": "spare", 00:23:53.653 "progress": { 00:23:53.653 "blocks": 38912, 00:23:53.653 "percent": 61 00:23:53.653 } 00:23:53.653 }, 00:23:53.653 "base_bdevs_list": [ 00:23:53.653 { 00:23:53.653 "name": "spare", 00:23:53.653 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:23:53.653 "is_configured": true, 00:23:53.653 "data_offset": 2048, 00:23:53.653 "data_size": 63488 00:23:53.653 }, 00:23:53.653 { 00:23:53.653 "name": "BaseBdev2", 00:23:53.653 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:53.653 "is_configured": true, 00:23:53.653 "data_offset": 2048, 00:23:53.653 "data_size": 63488 00:23:53.653 } 00:23:53.653 ] 00:23:53.653 }' 00:23:53.653 16:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:53.653 16:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:53.653 16:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:53.925 16:00:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:53.925 16:00:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:54.189 [2024-07-12 16:00:14.613167] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:54.758 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:54.758 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:54.758 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.758 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:54.758 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:54.758 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:54.758 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.758 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.016 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:55.016 "name": "raid_bdev1", 00:23:55.016 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:55.016 "strip_size_kb": 0, 00:23:55.016 "state": "online", 00:23:55.016 "raid_level": "raid1", 00:23:55.016 "superblock": true, 00:23:55.016 "num_base_bdevs": 2, 00:23:55.016 "num_base_bdevs_discovered": 2, 00:23:55.016 "num_base_bdevs_operational": 2, 00:23:55.016 "process": { 00:23:55.016 "type": "rebuild", 00:23:55.016 "target": "spare", 00:23:55.016 "progress": { 00:23:55.016 "blocks": 61440, 00:23:55.016 "percent": 96 00:23:55.016 } 00:23:55.016 }, 00:23:55.016 "base_bdevs_list": [ 00:23:55.016 { 00:23:55.016 "name": "spare", 00:23:55.016 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:23:55.016 "is_configured": true, 00:23:55.016 "data_offset": 2048, 00:23:55.016 "data_size": 63488 00:23:55.016 }, 00:23:55.016 { 00:23:55.016 "name": "BaseBdev2", 00:23:55.016 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:55.016 "is_configured": true, 00:23:55.016 "data_offset": 2048, 00:23:55.016 "data_size": 63488 00:23:55.016 } 00:23:55.016 ] 00:23:55.016 }' 00:23:55.016 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:55.016 [2024-07-12 16:00:15.360756] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:55.016 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:55.016 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:55.016 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:55.016 16:00:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:55.291 [2024-07-12 16:00:15.467442] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:55.291 [2024-07-12 16:00:15.469036] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:56.234 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:56.234 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:56.234 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:56.234 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:56.234 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:56.234 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:56.235 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.235 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.235 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:56.235 "name": "raid_bdev1", 00:23:56.235 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:56.235 "strip_size_kb": 0, 00:23:56.235 "state": "online", 00:23:56.235 "raid_level": "raid1", 00:23:56.235 "superblock": true, 00:23:56.235 "num_base_bdevs": 2, 00:23:56.235 "num_base_bdevs_discovered": 2, 00:23:56.235 "num_base_bdevs_operational": 2, 00:23:56.235 "base_bdevs_list": [ 00:23:56.235 { 00:23:56.235 "name": "spare", 00:23:56.235 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:23:56.235 "is_configured": true, 00:23:56.235 "data_offset": 2048, 00:23:56.235 "data_size": 63488 00:23:56.235 }, 00:23:56.235 { 00:23:56.235 "name": "BaseBdev2", 00:23:56.235 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:56.235 "is_configured": true, 00:23:56.235 "data_offset": 2048, 00:23:56.235 "data_size": 63488 00:23:56.235 } 00:23:56.235 ] 00:23:56.235 }' 00:23:56.235 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.235 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:56.235 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:56.504 "name": "raid_bdev1", 00:23:56.504 "uuid": 
"f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:56.504 "strip_size_kb": 0, 00:23:56.504 "state": "online", 00:23:56.504 "raid_level": "raid1", 00:23:56.504 "superblock": true, 00:23:56.504 "num_base_bdevs": 2, 00:23:56.504 "num_base_bdevs_discovered": 2, 00:23:56.504 "num_base_bdevs_operational": 2, 00:23:56.504 "base_bdevs_list": [ 00:23:56.504 { 00:23:56.504 "name": "spare", 00:23:56.504 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:23:56.504 "is_configured": true, 00:23:56.504 "data_offset": 2048, 00:23:56.504 "data_size": 63488 00:23:56.504 }, 00:23:56.504 { 00:23:56.504 "name": "BaseBdev2", 00:23:56.504 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:56.504 "is_configured": true, 00:23:56.504 "data_offset": 2048, 00:23:56.504 "data_size": 63488 00:23:56.504 } 00:23:56.504 ] 00:23:56.504 }' 00:23:56.504 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:56.764 16:00:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:56.764 16:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:56.764 16:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.764 16:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:56.764 "name": "raid_bdev1", 00:23:56.764 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:56.764 "strip_size_kb": 0, 00:23:56.764 "state": "online", 00:23:56.764 "raid_level": "raid1", 00:23:56.764 "superblock": true, 00:23:56.764 "num_base_bdevs": 2, 00:23:56.764 "num_base_bdevs_discovered": 2, 00:23:56.764 "num_base_bdevs_operational": 2, 00:23:56.764 "base_bdevs_list": [ 00:23:56.764 { 00:23:56.764 "name": "spare", 00:23:56.764 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:23:56.764 "is_configured": true, 00:23:56.764 "data_offset": 2048, 00:23:56.764 "data_size": 63488 00:23:56.764 }, 00:23:56.764 { 00:23:56.764 "name": "BaseBdev2", 00:23:56.764 "uuid": 
"b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:56.764 "is_configured": true, 00:23:56.764 "data_offset": 2048, 00:23:56.764 "data_size": 63488 00:23:56.764 } 00:23:56.764 ] 00:23:56.764 }' 00:23:56.764 16:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:56.764 16:00:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:57.333 16:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:57.592 [2024-07-12 16:00:17.900320] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:57.593 [2024-07-12 16:00:17.900342] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:57.593 00:23:57.593 Latency(us) 00:23:57.593 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:57.593 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:57.593 raid_bdev1 : 10.73 110.49 331.46 0.00 0.00 12292.73 244.18 114536.76 00:23:57.593 =================================================================================================================== 00:23:57.593 Total : 110.49 331.46 0.00 0.00 12292.73 244.18 114536.76 00:23:57.593 [2024-07-12 16:00:17.951636] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:57.593 [2024-07-12 16:00:17.951658] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:57.593 [2024-07-12 16:00:17.951718] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:57.593 [2024-07-12 16:00:17.951724] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2519d30 name raid_bdev1, state offline 00:23:57.593 0 00:23:57.593 16:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.593 16:00:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:57.852 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:58.112 /dev/nbd0 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:58.112 1+0 records in 00:23:58.112 1+0 records out 00:23:58.112 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308717 s, 13.3 MB/s 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 
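The trace above shows nbd_start_disks exporting the rebuilt "spare" bdev as /dev/nbd0: an nbd_start_disk RPC, then waitfornbd polling /proc/partitions, then a single direct-I/O dd read; the same helper is run next for BaseBdev2 on /dev/nbd1 so the two devices can be byte-compared with cmp. A minimal standalone sketch of that export-and-wait flow follows, reusing the rpc.py path, socket and bdev name taken from this log; the retry loop and 0.1 s sleep only approximate the waitfornbd helper and should be treated as assumptions outside this CI environment.

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  sock=/var/tmp/spdk-raid.sock
  # export the bdev over NBD, then wait (up to ~2 s) for the kernel to publish it
  "$rpc" -s "$sock" nbd_start_disk spare /dev/nbd0
  for i in $(seq 1 20); do
      grep -q -w nbd0 /proc/partitions && break
      sleep 0.1
  done
  # single 4 KiB direct read to confirm the exported device is usable
  dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct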
00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:58.112 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:23:58.371 /dev/nbd1 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:58.371 1+0 records in 00:23:58.371 1+0 records out 00:23:58.371 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285171 s, 14.4 MB/s 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:58.371 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:58.630 16:00:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:58.889 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:59.149 [2024-07-12 16:00:19.474310] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:59.149 [2024-07-12 16:00:19.474341] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:59.149 [2024-07-12 16:00:19.474353] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x251b840 00:23:59.149 [2024-07-12 16:00:19.474359] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:59.149 [2024-07-12 16:00:19.475658] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:59.149 [2024-07-12 16:00:19.475685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:59.149 [2024-07-12 16:00:19.475751] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:59.149 [2024-07-12 16:00:19.475772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:59.149 [2024-07-12 16:00:19.475851] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:59.149 spare 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.149 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.149 [2024-07-12 16:00:19.576141] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26c9af0 00:23:59.149 [2024-07-12 16:00:19.576150] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:59.149 [2024-07-12 16:00:19.576305] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26c7580 00:23:59.149 [2024-07-12 16:00:19.576418] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26c9af0 00:23:59.149 [2024-07-12 16:00:19.576423] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26c9af0 00:23:59.149 [2024-07-12 16:00:19.576504] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:59.409 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:59.409 "name": "raid_bdev1", 00:23:59.409 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:23:59.409 "strip_size_kb": 0, 00:23:59.409 "state": "online", 00:23:59.409 "raid_level": "raid1", 00:23:59.409 "superblock": true, 00:23:59.409 "num_base_bdevs": 2, 00:23:59.409 "num_base_bdevs_discovered": 2, 00:23:59.409 "num_base_bdevs_operational": 2, 00:23:59.409 "base_bdevs_list": [ 00:23:59.409 { 00:23:59.409 "name": "spare", 00:23:59.409 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:23:59.409 "is_configured": true, 00:23:59.409 
"data_offset": 2048, 00:23:59.409 "data_size": 63488 00:23:59.409 }, 00:23:59.409 { 00:23:59.409 "name": "BaseBdev2", 00:23:59.409 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:23:59.409 "is_configured": true, 00:23:59.409 "data_offset": 2048, 00:23:59.409 "data_size": 63488 00:23:59.409 } 00:23:59.409 ] 00:23:59.409 }' 00:23:59.409 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:59.409 16:00:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:59.978 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:59.978 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:59.978 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:59.978 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:59.978 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:59.978 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.978 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.548 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:00.548 "name": "raid_bdev1", 00:24:00.548 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:00.548 "strip_size_kb": 0, 00:24:00.548 "state": "online", 00:24:00.548 "raid_level": "raid1", 00:24:00.548 "superblock": true, 00:24:00.548 "num_base_bdevs": 2, 00:24:00.548 "num_base_bdevs_discovered": 2, 00:24:00.548 "num_base_bdevs_operational": 2, 00:24:00.548 "base_bdevs_list": [ 00:24:00.548 { 00:24:00.548 "name": "spare", 00:24:00.548 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:24:00.548 "is_configured": true, 00:24:00.548 "data_offset": 2048, 00:24:00.548 "data_size": 63488 00:24:00.548 }, 00:24:00.548 { 00:24:00.548 "name": "BaseBdev2", 00:24:00.548 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:00.548 "is_configured": true, 00:24:00.548 "data_offset": 2048, 00:24:00.548 "data_size": 63488 00:24:00.548 } 00:24:00.548 ] 00:24:00.548 }' 00:24:00.548 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:00.548 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:00.548 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:00.548 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:00.548 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.548 16:00:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:00.807 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:00.807 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:00.807 [2024-07-12 16:00:21.251086] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:01.067 "name": "raid_bdev1", 00:24:01.067 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:01.067 "strip_size_kb": 0, 00:24:01.067 "state": "online", 00:24:01.067 "raid_level": "raid1", 00:24:01.067 "superblock": true, 00:24:01.067 "num_base_bdevs": 2, 00:24:01.067 "num_base_bdevs_discovered": 1, 00:24:01.067 "num_base_bdevs_operational": 1, 00:24:01.067 "base_bdevs_list": [ 00:24:01.067 { 00:24:01.067 "name": null, 00:24:01.067 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.067 "is_configured": false, 00:24:01.067 "data_offset": 2048, 00:24:01.067 "data_size": 63488 00:24:01.067 }, 00:24:01.067 { 00:24:01.067 "name": "BaseBdev2", 00:24:01.067 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:01.067 "is_configured": true, 00:24:01.067 "data_offset": 2048, 00:24:01.067 "data_size": 63488 00:24:01.067 } 00:24:01.067 ] 00:24:01.067 }' 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:01.067 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:01.637 16:00:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:01.917 [2024-07-12 16:00:22.173542] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:01.917 [2024-07-12 16:00:22.173654] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:01.917 [2024-07-12 16:00:22.173663] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
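Every verify_raid_bdev_process call in this trace follows the same pattern: dump the raid bdev's JSON over the RPC socket, select the bdev by name, and compare the background-process fields with jq, falling back to "none" when no rebuild is in flight. A condensed sketch of that check, reusing the socket path, bdev name and jq filters visible above (a sketch of the pattern, not the full helper from bdev_raid.sh):

  rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
  info=$("$rpc" -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
         | jq -r '.[] | select(.name == "raid_bdev1")')
  # while a rebuild is running these evaluate to "rebuild" and "spare";
  # once it finishes both fields disappear and the filters fall back to "none"
  [[ $(echo "$info" | jq -r '.process.type // "none"') == rebuild ]]
  [[ $(echo "$info" | jq -r '.process.target // "none"') == spare ]]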
00:24:01.917 [2024-07-12 16:00:22.173682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:01.917 [2024-07-12 16:00:22.177304] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25678c0 00:24:01.917 [2024-07-12 16:00:22.178875] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:01.917 16:00:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:02.875 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:02.876 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:02.876 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:02.876 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:02.876 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:02.876 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.876 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.135 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:03.135 "name": "raid_bdev1", 00:24:03.135 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:03.135 "strip_size_kb": 0, 00:24:03.135 "state": "online", 00:24:03.135 "raid_level": "raid1", 00:24:03.135 "superblock": true, 00:24:03.135 "num_base_bdevs": 2, 00:24:03.135 "num_base_bdevs_discovered": 2, 00:24:03.135 "num_base_bdevs_operational": 2, 00:24:03.135 "process": { 00:24:03.135 "type": "rebuild", 00:24:03.135 "target": "spare", 00:24:03.135 "progress": { 00:24:03.135 "blocks": 22528, 00:24:03.135 "percent": 35 00:24:03.135 } 00:24:03.135 }, 00:24:03.135 "base_bdevs_list": [ 00:24:03.135 { 00:24:03.135 "name": "spare", 00:24:03.135 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:24:03.135 "is_configured": true, 00:24:03.135 "data_offset": 2048, 00:24:03.135 "data_size": 63488 00:24:03.135 }, 00:24:03.135 { 00:24:03.135 "name": "BaseBdev2", 00:24:03.135 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:03.135 "is_configured": true, 00:24:03.135 "data_offset": 2048, 00:24:03.135 "data_size": 63488 00:24:03.135 } 00:24:03.135 ] 00:24:03.135 }' 00:24:03.135 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:03.135 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:03.135 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:03.135 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:03.135 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:03.394 [2024-07-12 16:00:23.667394] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:03.394 [2024-07-12 16:00:23.687697] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:03.394 [2024-07-12 16:00:23.687728] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:03.394 [2024-07-12 16:00:23.687738] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:03.394 [2024-07-12 16:00:23.687743] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.394 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.654 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:03.654 "name": "raid_bdev1", 00:24:03.654 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:03.654 "strip_size_kb": 0, 00:24:03.654 "state": "online", 00:24:03.654 "raid_level": "raid1", 00:24:03.654 "superblock": true, 00:24:03.654 "num_base_bdevs": 2, 00:24:03.654 "num_base_bdevs_discovered": 1, 00:24:03.654 "num_base_bdevs_operational": 1, 00:24:03.654 "base_bdevs_list": [ 00:24:03.654 { 00:24:03.654 "name": null, 00:24:03.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.654 "is_configured": false, 00:24:03.654 "data_offset": 2048, 00:24:03.654 "data_size": 63488 00:24:03.654 }, 00:24:03.654 { 00:24:03.654 "name": "BaseBdev2", 00:24:03.654 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:03.654 "is_configured": true, 00:24:03.654 "data_offset": 2048, 00:24:03.654 "data_size": 63488 00:24:03.654 } 00:24:03.654 ] 00:24:03.654 }' 00:24:03.654 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:03.654 16:00:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:04.222 16:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:04.222 [2024-07-12 16:00:24.629956] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:04.222 [2024-07-12 16:00:24.629991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:04.222 [2024-07-12 16:00:24.630004] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x251ffa0 00:24:04.222 
[2024-07-12 16:00:24.630011] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:04.222 [2024-07-12 16:00:24.630308] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:04.222 [2024-07-12 16:00:24.630319] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:04.222 [2024-07-12 16:00:24.630378] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:04.222 [2024-07-12 16:00:24.630386] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:24:04.222 [2024-07-12 16:00:24.630392] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:04.222 [2024-07-12 16:00:24.630404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:04.222 [2024-07-12 16:00:24.633975] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25b8160 00:24:04.222 spare 00:24:04.222 [2024-07-12 16:00:24.635101] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:04.222 16:00:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:05.602 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:05.602 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:05.602 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:05.602 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:05.602 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:05.602 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.602 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.602 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:05.602 "name": "raid_bdev1", 00:24:05.602 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:05.602 "strip_size_kb": 0, 00:24:05.602 "state": "online", 00:24:05.602 "raid_level": "raid1", 00:24:05.602 "superblock": true, 00:24:05.602 "num_base_bdevs": 2, 00:24:05.602 "num_base_bdevs_discovered": 2, 00:24:05.602 "num_base_bdevs_operational": 2, 00:24:05.602 "process": { 00:24:05.602 "type": "rebuild", 00:24:05.602 "target": "spare", 00:24:05.602 "progress": { 00:24:05.602 "blocks": 22528, 00:24:05.602 "percent": 35 00:24:05.602 } 00:24:05.602 }, 00:24:05.602 "base_bdevs_list": [ 00:24:05.602 { 00:24:05.602 "name": "spare", 00:24:05.602 "uuid": "2355f73e-a7f5-5c61-bcab-5d48db46bb25", 00:24:05.602 "is_configured": true, 00:24:05.602 "data_offset": 2048, 00:24:05.602 "data_size": 63488 00:24:05.602 }, 00:24:05.602 { 00:24:05.602 "name": "BaseBdev2", 00:24:05.602 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:05.602 "is_configured": true, 00:24:05.602 "data_offset": 2048, 00:24:05.602 "data_size": 63488 00:24:05.602 } 00:24:05.602 ] 00:24:05.603 }' 00:24:05.603 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:05.603 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:05.603 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:05.603 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:05.603 16:00:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:05.863 [2024-07-12 16:00:26.087005] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:05.863 [2024-07-12 16:00:26.144000] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:05.863 [2024-07-12 16:00:26.144029] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:05.863 [2024-07-12 16:00:26.144038] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:05.863 [2024-07-12 16:00:26.144043] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.863 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.122 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:06.122 "name": "raid_bdev1", 00:24:06.122 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:06.122 "strip_size_kb": 0, 00:24:06.122 "state": "online", 00:24:06.122 "raid_level": "raid1", 00:24:06.122 "superblock": true, 00:24:06.122 "num_base_bdevs": 2, 00:24:06.122 "num_base_bdevs_discovered": 1, 00:24:06.122 "num_base_bdevs_operational": 1, 00:24:06.122 "base_bdevs_list": [ 00:24:06.122 { 00:24:06.122 "name": null, 00:24:06.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.122 "is_configured": false, 00:24:06.122 "data_offset": 2048, 00:24:06.122 "data_size": 63488 00:24:06.122 }, 00:24:06.122 { 00:24:06.122 "name": "BaseBdev2", 00:24:06.122 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:06.122 "is_configured": true, 00:24:06.122 "data_offset": 2048, 00:24:06.122 "data_size": 63488 00:24:06.122 } 00:24:06.122 ] 00:24:06.122 
}' 00:24:06.122 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:06.122 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:06.695 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:06.695 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:06.695 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:06.695 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:06.695 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:06.695 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.695 16:00:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.695 16:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.695 "name": "raid_bdev1", 00:24:06.695 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:06.695 "strip_size_kb": 0, 00:24:06.695 "state": "online", 00:24:06.695 "raid_level": "raid1", 00:24:06.695 "superblock": true, 00:24:06.695 "num_base_bdevs": 2, 00:24:06.695 "num_base_bdevs_discovered": 1, 00:24:06.695 "num_base_bdevs_operational": 1, 00:24:06.695 "base_bdevs_list": [ 00:24:06.695 { 00:24:06.695 "name": null, 00:24:06.695 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:06.695 "is_configured": false, 00:24:06.695 "data_offset": 2048, 00:24:06.695 "data_size": 63488 00:24:06.695 }, 00:24:06.695 { 00:24:06.695 "name": "BaseBdev2", 00:24:06.695 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:06.695 "is_configured": true, 00:24:06.695 "data_offset": 2048, 00:24:06.695 "data_size": 63488 00:24:06.695 } 00:24:06.695 ] 00:24:06.695 }' 00:24:06.695 16:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:06.695 16:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:06.695 16:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.955 16:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:06.955 16:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:06.955 16:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:07.214 [2024-07-12 16:00:27.510909] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:07.214 [2024-07-12 16:00:27.510937] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:07.214 [2024-07-12 16:00:27.510949] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2521240 00:24:07.214 [2024-07-12 16:00:27.510956] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:07.214 [2024-07-12 16:00:27.511218] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:24:07.214 [2024-07-12 16:00:27.511228] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:07.214 [2024-07-12 16:00:27.511272] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:07.214 [2024-07-12 16:00:27.511278] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:07.214 [2024-07-12 16:00:27.511283] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:07.214 BaseBdev1 00:24:07.214 16:00:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:08.178 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:08.437 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:08.437 "name": "raid_bdev1", 00:24:08.437 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:08.437 "strip_size_kb": 0, 00:24:08.437 "state": "online", 00:24:08.437 "raid_level": "raid1", 00:24:08.437 "superblock": true, 00:24:08.437 "num_base_bdevs": 2, 00:24:08.437 "num_base_bdevs_discovered": 1, 00:24:08.437 "num_base_bdevs_operational": 1, 00:24:08.437 "base_bdevs_list": [ 00:24:08.437 { 00:24:08.437 "name": null, 00:24:08.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:08.437 "is_configured": false, 00:24:08.437 "data_offset": 2048, 00:24:08.437 "data_size": 63488 00:24:08.437 }, 00:24:08.437 { 00:24:08.437 "name": "BaseBdev2", 00:24:08.437 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:08.437 "is_configured": true, 00:24:08.437 "data_offset": 2048, 00:24:08.437 "data_size": 63488 00:24:08.437 } 00:24:08.437 ] 00:24:08.437 }' 00:24:08.437 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:08.437 16:00:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:09.006 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:09.006 16:00:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:09.006 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:09.006 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:09.006 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:09.006 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.006 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.265 "name": "raid_bdev1", 00:24:09.265 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:09.265 "strip_size_kb": 0, 00:24:09.265 "state": "online", 00:24:09.265 "raid_level": "raid1", 00:24:09.265 "superblock": true, 00:24:09.265 "num_base_bdevs": 2, 00:24:09.265 "num_base_bdevs_discovered": 1, 00:24:09.265 "num_base_bdevs_operational": 1, 00:24:09.265 "base_bdevs_list": [ 00:24:09.265 { 00:24:09.265 "name": null, 00:24:09.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.265 "is_configured": false, 00:24:09.265 "data_offset": 2048, 00:24:09.265 "data_size": 63488 00:24:09.265 }, 00:24:09.265 { 00:24:09.265 "name": "BaseBdev2", 00:24:09.265 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:09.265 "is_configured": true, 00:24:09.265 "data_offset": 2048, 00:24:09.265 "data_size": 63488 00:24:09.265 } 00:24:09.265 ] 00:24:09.265 }' 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:09.265 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:09.525 [2024-07-12 16:00:29.748930] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:09.525 [2024-07-12 16:00:29.749014] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:24:09.525 [2024-07-12 16:00:29.749022] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:09.525 request: 00:24:09.525 { 00:24:09.525 "base_bdev": "BaseBdev1", 00:24:09.525 "raid_bdev": "raid_bdev1", 00:24:09.525 "method": "bdev_raid_add_base_bdev", 00:24:09.525 "req_id": 1 00:24:09.525 } 00:24:09.525 Got JSON-RPC error response 00:24:09.525 response: 00:24:09.525 { 00:24:09.525 "code": -22, 00:24:09.525 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:09.525 } 00:24:09.526 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:24:09.526 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:24:09.526 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:24:09.526 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:24:09.526 16:00:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.464 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.724 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:24:10.724 "name": "raid_bdev1", 00:24:10.724 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:10.724 "strip_size_kb": 0, 00:24:10.724 "state": "online", 00:24:10.724 "raid_level": "raid1", 00:24:10.724 "superblock": true, 00:24:10.724 "num_base_bdevs": 2, 00:24:10.724 "num_base_bdevs_discovered": 1, 00:24:10.724 "num_base_bdevs_operational": 1, 00:24:10.724 "base_bdevs_list": [ 00:24:10.724 { 00:24:10.724 "name": null, 00:24:10.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.724 "is_configured": false, 00:24:10.724 "data_offset": 2048, 00:24:10.724 "data_size": 63488 00:24:10.724 }, 00:24:10.724 { 00:24:10.724 "name": "BaseBdev2", 00:24:10.724 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:10.724 "is_configured": true, 00:24:10.724 "data_offset": 2048, 00:24:10.724 "data_size": 63488 00:24:10.724 } 00:24:10.724 ] 00:24:10.724 }' 00:24:10.724 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:10.724 16:00:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.294 "name": "raid_bdev1", 00:24:11.294 "uuid": "f6252f25-61a5-431f-ac83-d889f0460c53", 00:24:11.294 "strip_size_kb": 0, 00:24:11.294 "state": "online", 00:24:11.294 "raid_level": "raid1", 00:24:11.294 "superblock": true, 00:24:11.294 "num_base_bdevs": 2, 00:24:11.294 "num_base_bdevs_discovered": 1, 00:24:11.294 "num_base_bdevs_operational": 1, 00:24:11.294 "base_bdevs_list": [ 00:24:11.294 { 00:24:11.294 "name": null, 00:24:11.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.294 "is_configured": false, 00:24:11.294 "data_offset": 2048, 00:24:11.294 "data_size": 63488 00:24:11.294 }, 00:24:11.294 { 00:24:11.294 "name": "BaseBdev2", 00:24:11.294 "uuid": "b109fb4b-7883-519f-adf4-48fe79d16b16", 00:24:11.294 "is_configured": true, 00:24:11.294 "data_offset": 2048, 00:24:11.294 "data_size": 63488 00:24:11.294 } 00:24:11.294 ] 00:24:11.294 }' 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:11.294 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2632412 00:24:11.554 16:00:31 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2632412 ']' 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2632412 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2632412 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2632412' 00:24:11.554 killing process with pid 2632412 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2632412 00:24:11.554 Received shutdown signal, test time was about 24.578847 seconds 00:24:11.554 00:24:11.554 Latency(us) 00:24:11.554 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:11.554 =================================================================================================================== 00:24:11.554 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:11.554 [2024-07-12 16:00:31.825346] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:11.554 [2024-07-12 16:00:31.825416] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:11.554 [2024-07-12 16:00:31.825450] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:11.554 [2024-07-12 16:00:31.825456] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26c9af0 name raid_bdev1, state offline 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2632412 00:24:11.554 [2024-07-12 16:00:31.837390] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:24:11.554 00:24:11.554 real 0m29.534s 00:24:11.554 user 0m46.691s 00:24:11.554 sys 0m3.278s 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:11.554 16:00:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:11.554 ************************************ 00:24:11.554 END TEST raid_rebuild_test_sb_io 00:24:11.554 ************************************ 00:24:11.815 16:00:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:11.815 16:00:32 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:24:11.815 16:00:32 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:24:11.815 16:00:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:11.815 16:00:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:11.815 16:00:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:11.815 ************************************ 00:24:11.815 START TEST raid_rebuild_test 00:24:11.815 ************************************ 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:24:11.815 
16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:11.815 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2638216 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2638216 /var/tmp/spdk-raid.sock 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 2638216 ']' 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:11.816 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:11.816 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:11.816 [2024-07-12 16:00:32.105465] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:24:11.816 [2024-07-12 16:00:32.105512] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2638216 ] 00:24:11.816 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:11.816 Zero copy mechanism will not be used. 00:24:11.816 [2024-07-12 16:00:32.193470] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:11.816 [2024-07-12 16:00:32.258090] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:12.076 [2024-07-12 16:00:32.301660] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:12.076 [2024-07-12 16:00:32.301686] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:12.646 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:12.646 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:24:12.646 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:12.646 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:12.907 BaseBdev1_malloc 00:24:12.907 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:12.907 [2024-07-12 16:00:33.312044] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:12.907 [2024-07-12 16:00:33.312077] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:12.907 [2024-07-12 16:00:33.312091] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd2a010 00:24:12.907 [2024-07-12 16:00:33.312101] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:12.907 [2024-07-12 16:00:33.313382] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:12.907 [2024-07-12 16:00:33.313401] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:12.907 BaseBdev1 00:24:12.907 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:12.907 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:13.166 BaseBdev2_malloc 
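The trace around here builds each RAID member the same way: a 32 MB, 512-byte-block malloc bdev wrapped in a passthru bdev. A minimal sketch of that two-step RPC pattern, assuming the loop form and the rpc() shell wrapper (the RPC commands, arguments and socket path are the ones shown in the trace):

  rpc() { /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock "$@"; }
  for i in 1 2 3 4; do
      rpc bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"              # backing malloc bdev: 32 MB, 512-byte blocks
      rpc bdev_passthru_create -b "BaseBdev${i}_malloc" -p "BaseBdev${i}" # passthru wrapper used as the RAID member
  done

The spare follows the same idea in the trace that continues below, with a bdev_delay_create layer inserted between the malloc bdev and the passthru bdev.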
00:24:13.166 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:13.427 [2024-07-12 16:00:33.691028] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:13.427 [2024-07-12 16:00:33.691056] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:13.427 [2024-07-12 16:00:33.691069] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd2ac30 00:24:13.427 [2024-07-12 16:00:33.691075] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:13.427 [2024-07-12 16:00:33.692257] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:13.427 [2024-07-12 16:00:33.692275] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:13.427 BaseBdev2 00:24:13.427 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:13.427 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:13.427 BaseBdev3_malloc 00:24:13.687 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:13.687 [2024-07-12 16:00:34.057848] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:13.687 [2024-07-12 16:00:34.057876] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:13.687 [2024-07-12 16:00:34.057887] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed22c0 00:24:13.687 [2024-07-12 16:00:34.057893] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:13.687 [2024-07-12 16:00:34.059063] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:13.687 [2024-07-12 16:00:34.059081] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:13.687 BaseBdev3 00:24:13.687 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:13.687 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:13.947 BaseBdev4_malloc 00:24:13.947 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:14.207 [2024-07-12 16:00:34.424714] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:14.207 [2024-07-12 16:00:34.424741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:14.207 [2024-07-12 16:00:34.424753] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd224e0 00:24:14.207 [2024-07-12 16:00:34.424760] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:14.207 [2024-07-12 16:00:34.425922] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:14.207 [2024-07-12 16:00:34.425940] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:14.207 BaseBdev4 00:24:14.207 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:14.207 spare_malloc 00:24:14.207 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:14.467 spare_delay 00:24:14.467 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:14.726 [2024-07-12 16:00:34.996044] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:14.726 [2024-07-12 16:00:34.996070] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:14.727 [2024-07-12 16:00:34.996083] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd20c80 00:24:14.727 [2024-07-12 16:00:34.996088] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:14.727 [2024-07-12 16:00:34.997277] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:14.727 [2024-07-12 16:00:34.997295] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:14.727 spare 00:24:14.727 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:14.987 [2024-07-12 16:00:35.184539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:14.987 [2024-07-12 16:00:35.185535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:14.987 [2024-07-12 16:00:35.185575] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:14.987 [2024-07-12 16:00:35.185609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:14.987 [2024-07-12 16:00:35.185670] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd235b0 00:24:14.987 [2024-07-12 16:00:35.185676] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:14.987 [2024-07-12 16:00:35.185838] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd27b50 00:24:14.987 [2024-07-12 16:00:35.185952] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd235b0 00:24:14.987 [2024-07-12 16:00:35.185957] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd235b0 00:24:14.987 [2024-07-12 16:00:35.186040] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:14.987 
16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:14.987 "name": "raid_bdev1", 00:24:14.987 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:14.987 "strip_size_kb": 0, 00:24:14.987 "state": "online", 00:24:14.987 "raid_level": "raid1", 00:24:14.987 "superblock": false, 00:24:14.987 "num_base_bdevs": 4, 00:24:14.987 "num_base_bdevs_discovered": 4, 00:24:14.987 "num_base_bdevs_operational": 4, 00:24:14.987 "base_bdevs_list": [ 00:24:14.987 { 00:24:14.987 "name": "BaseBdev1", 00:24:14.987 "uuid": "1dde94d5-8783-53e2-a607-a1d14922e117", 00:24:14.987 "is_configured": true, 00:24:14.987 "data_offset": 0, 00:24:14.987 "data_size": 65536 00:24:14.987 }, 00:24:14.987 { 00:24:14.987 "name": "BaseBdev2", 00:24:14.987 "uuid": "5d7403da-f99f-5d9a-967c-22d4092ad940", 00:24:14.987 "is_configured": true, 00:24:14.987 "data_offset": 0, 00:24:14.987 "data_size": 65536 00:24:14.987 }, 00:24:14.987 { 00:24:14.987 "name": "BaseBdev3", 00:24:14.987 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:14.987 "is_configured": true, 00:24:14.987 "data_offset": 0, 00:24:14.987 "data_size": 65536 00:24:14.987 }, 00:24:14.987 { 00:24:14.987 "name": "BaseBdev4", 00:24:14.987 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:14.987 "is_configured": true, 00:24:14.987 "data_offset": 0, 00:24:14.987 "data_size": 65536 00:24:14.987 } 00:24:14.987 ] 00:24:14.987 }' 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:14.987 16:00:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:15.557 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:15.557 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:15.818 [2024-07-12 16:00:36.103195] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:15.818 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:15.818 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:15.818 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false 
= true ']' 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:16.078 [2024-07-12 16:00:36.491965] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd21a80 00:24:16.078 /dev/nbd0 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:16.078 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:16.337 1+0 records in 00:24:16.337 1+0 records out 00:24:16.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306108 s, 13.4 MB/s 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # 
(( i++ )) 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:16.337 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:26.331 65536+0 records in 00:24:26.331 65536+0 records out 00:24:26.331 33554432 bytes (34 MB, 32 MiB) copied, 9.75453 s, 3.4 MB/s 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:26.331 [2024-07-12 16:00:46.504408] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:26.331 16:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:26.589 [2024-07-12 16:00:47.017741] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.849 "name": "raid_bdev1", 00:24:26.849 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:26.849 "strip_size_kb": 0, 00:24:26.849 "state": "online", 00:24:26.849 "raid_level": "raid1", 00:24:26.849 "superblock": false, 00:24:26.849 "num_base_bdevs": 4, 00:24:26.849 "num_base_bdevs_discovered": 3, 00:24:26.849 "num_base_bdevs_operational": 3, 00:24:26.849 "base_bdevs_list": [ 00:24:26.849 { 00:24:26.849 "name": null, 00:24:26.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.849 "is_configured": false, 00:24:26.849 "data_offset": 0, 00:24:26.849 "data_size": 65536 00:24:26.849 }, 00:24:26.849 { 00:24:26.849 "name": "BaseBdev2", 00:24:26.849 "uuid": "5d7403da-f99f-5d9a-967c-22d4092ad940", 00:24:26.849 "is_configured": true, 00:24:26.849 "data_offset": 0, 00:24:26.849 "data_size": 65536 00:24:26.849 }, 00:24:26.849 { 00:24:26.849 "name": "BaseBdev3", 00:24:26.849 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:26.849 "is_configured": true, 00:24:26.849 "data_offset": 0, 00:24:26.849 "data_size": 65536 00:24:26.849 }, 00:24:26.849 { 00:24:26.849 "name": "BaseBdev4", 00:24:26.849 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:26.849 "is_configured": true, 00:24:26.849 "data_offset": 0, 00:24:26.849 "data_size": 65536 00:24:26.849 } 00:24:26.849 ] 00:24:26.849 }' 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.849 16:00:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:27.417 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:27.676 [2024-07-12 16:00:47.936052] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:27.676 [2024-07-12 16:00:47.938900] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd25350 00:24:27.676 [2024-07-12 16:00:47.940507] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:27.676 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:28.614 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:28.614 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:28.614 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:28.614 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:28.614 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:28.615 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.615 16:00:48 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.910 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:28.910 "name": "raid_bdev1", 00:24:28.910 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:28.910 "strip_size_kb": 0, 00:24:28.910 "state": "online", 00:24:28.910 "raid_level": "raid1", 00:24:28.910 "superblock": false, 00:24:28.910 "num_base_bdevs": 4, 00:24:28.910 "num_base_bdevs_discovered": 4, 00:24:28.910 "num_base_bdevs_operational": 4, 00:24:28.910 "process": { 00:24:28.910 "type": "rebuild", 00:24:28.910 "target": "spare", 00:24:28.910 "progress": { 00:24:28.910 "blocks": 22528, 00:24:28.910 "percent": 34 00:24:28.910 } 00:24:28.910 }, 00:24:28.910 "base_bdevs_list": [ 00:24:28.910 { 00:24:28.910 "name": "spare", 00:24:28.910 "uuid": "dd676f31-8d51-5f80-b25c-fb59cf6e08e3", 00:24:28.910 "is_configured": true, 00:24:28.910 "data_offset": 0, 00:24:28.910 "data_size": 65536 00:24:28.910 }, 00:24:28.910 { 00:24:28.910 "name": "BaseBdev2", 00:24:28.910 "uuid": "5d7403da-f99f-5d9a-967c-22d4092ad940", 00:24:28.910 "is_configured": true, 00:24:28.910 "data_offset": 0, 00:24:28.910 "data_size": 65536 00:24:28.910 }, 00:24:28.910 { 00:24:28.910 "name": "BaseBdev3", 00:24:28.910 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:28.910 "is_configured": true, 00:24:28.910 "data_offset": 0, 00:24:28.910 "data_size": 65536 00:24:28.910 }, 00:24:28.910 { 00:24:28.910 "name": "BaseBdev4", 00:24:28.910 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:28.910 "is_configured": true, 00:24:28.910 "data_offset": 0, 00:24:28.910 "data_size": 65536 00:24:28.910 } 00:24:28.910 ] 00:24:28.910 }' 00:24:28.910 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:28.910 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:28.910 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:28.910 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:28.910 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:29.175 [2024-07-12 16:00:49.437424] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:29.175 [2024-07-12 16:00:49.449444] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:29.175 [2024-07-12 16:00:49.449474] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:29.175 [2024-07-12 16:00:49.449485] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:29.175 [2024-07-12 16:00:49.449490] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:29.175 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:29.175 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:29.175 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:29.175 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:29.175 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:29.175 16:00:49 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:29.175 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:29.175 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:29.176 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:29.176 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:29.176 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.176 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.436 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:29.436 "name": "raid_bdev1", 00:24:29.436 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:29.436 "strip_size_kb": 0, 00:24:29.436 "state": "online", 00:24:29.436 "raid_level": "raid1", 00:24:29.436 "superblock": false, 00:24:29.436 "num_base_bdevs": 4, 00:24:29.436 "num_base_bdevs_discovered": 3, 00:24:29.436 "num_base_bdevs_operational": 3, 00:24:29.436 "base_bdevs_list": [ 00:24:29.436 { 00:24:29.436 "name": null, 00:24:29.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:29.436 "is_configured": false, 00:24:29.436 "data_offset": 0, 00:24:29.436 "data_size": 65536 00:24:29.436 }, 00:24:29.436 { 00:24:29.436 "name": "BaseBdev2", 00:24:29.436 "uuid": "5d7403da-f99f-5d9a-967c-22d4092ad940", 00:24:29.436 "is_configured": true, 00:24:29.436 "data_offset": 0, 00:24:29.436 "data_size": 65536 00:24:29.436 }, 00:24:29.436 { 00:24:29.436 "name": "BaseBdev3", 00:24:29.436 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:29.436 "is_configured": true, 00:24:29.436 "data_offset": 0, 00:24:29.436 "data_size": 65536 00:24:29.436 }, 00:24:29.436 { 00:24:29.436 "name": "BaseBdev4", 00:24:29.436 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:29.436 "is_configured": true, 00:24:29.436 "data_offset": 0, 00:24:29.436 "data_size": 65536 00:24:29.436 } 00:24:29.436 ] 00:24:29.436 }' 00:24:29.436 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:29.436 16:00:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:30.005 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:30.005 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:30.005 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:30.005 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:30.005 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:30.005 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.005 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.005 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:30.005 "name": "raid_bdev1", 00:24:30.005 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:30.005 "strip_size_kb": 0, 00:24:30.005 "state": "online", 00:24:30.005 
"raid_level": "raid1", 00:24:30.005 "superblock": false, 00:24:30.005 "num_base_bdevs": 4, 00:24:30.005 "num_base_bdevs_discovered": 3, 00:24:30.005 "num_base_bdevs_operational": 3, 00:24:30.005 "base_bdevs_list": [ 00:24:30.005 { 00:24:30.005 "name": null, 00:24:30.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:30.005 "is_configured": false, 00:24:30.005 "data_offset": 0, 00:24:30.005 "data_size": 65536 00:24:30.005 }, 00:24:30.005 { 00:24:30.005 "name": "BaseBdev2", 00:24:30.005 "uuid": "5d7403da-f99f-5d9a-967c-22d4092ad940", 00:24:30.005 "is_configured": true, 00:24:30.005 "data_offset": 0, 00:24:30.005 "data_size": 65536 00:24:30.005 }, 00:24:30.005 { 00:24:30.005 "name": "BaseBdev3", 00:24:30.005 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:30.005 "is_configured": true, 00:24:30.005 "data_offset": 0, 00:24:30.005 "data_size": 65536 00:24:30.005 }, 00:24:30.005 { 00:24:30.005 "name": "BaseBdev4", 00:24:30.005 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:30.005 "is_configured": true, 00:24:30.005 "data_offset": 0, 00:24:30.005 "data_size": 65536 00:24:30.005 } 00:24:30.005 ] 00:24:30.005 }' 00:24:30.005 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:30.265 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:30.265 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:30.265 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:30.265 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:30.265 [2024-07-12 16:00:50.676738] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:30.265 [2024-07-12 16:00:50.679536] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd27b30 00:24:30.265 [2024-07-12 16:00:50.680743] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:30.265 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:31.646 "name": "raid_bdev1", 00:24:31.646 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:31.646 "strip_size_kb": 0, 00:24:31.646 "state": "online", 00:24:31.646 "raid_level": "raid1", 00:24:31.646 "superblock": false, 00:24:31.646 "num_base_bdevs": 4, 00:24:31.646 "num_base_bdevs_discovered": 4, 00:24:31.646 
"num_base_bdevs_operational": 4, 00:24:31.646 "process": { 00:24:31.646 "type": "rebuild", 00:24:31.646 "target": "spare", 00:24:31.646 "progress": { 00:24:31.646 "blocks": 22528, 00:24:31.646 "percent": 34 00:24:31.646 } 00:24:31.646 }, 00:24:31.646 "base_bdevs_list": [ 00:24:31.646 { 00:24:31.646 "name": "spare", 00:24:31.646 "uuid": "dd676f31-8d51-5f80-b25c-fb59cf6e08e3", 00:24:31.646 "is_configured": true, 00:24:31.646 "data_offset": 0, 00:24:31.646 "data_size": 65536 00:24:31.646 }, 00:24:31.646 { 00:24:31.646 "name": "BaseBdev2", 00:24:31.646 "uuid": "5d7403da-f99f-5d9a-967c-22d4092ad940", 00:24:31.646 "is_configured": true, 00:24:31.646 "data_offset": 0, 00:24:31.646 "data_size": 65536 00:24:31.646 }, 00:24:31.646 { 00:24:31.646 "name": "BaseBdev3", 00:24:31.646 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:31.646 "is_configured": true, 00:24:31.646 "data_offset": 0, 00:24:31.646 "data_size": 65536 00:24:31.646 }, 00:24:31.646 { 00:24:31.646 "name": "BaseBdev4", 00:24:31.646 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:31.646 "is_configured": true, 00:24:31.646 "data_offset": 0, 00:24:31.646 "data_size": 65536 00:24:31.646 } 00:24:31.646 ] 00:24:31.646 }' 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:31.646 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:31.906 [2024-07-12 16:00:52.161107] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:31.906 [2024-07-12 16:00:52.189759] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xd27b30 00:24:31.906 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:31.906 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:24:31.906 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:31.906 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:31.906 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:31.906 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:31.906 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:31.906 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.906 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:32.166 "name": "raid_bdev1", 00:24:32.166 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:32.166 "strip_size_kb": 0, 00:24:32.166 "state": "online", 00:24:32.166 "raid_level": "raid1", 00:24:32.166 "superblock": false, 00:24:32.166 "num_base_bdevs": 4, 00:24:32.166 "num_base_bdevs_discovered": 3, 00:24:32.166 "num_base_bdevs_operational": 3, 00:24:32.166 "process": { 00:24:32.166 "type": "rebuild", 00:24:32.166 "target": "spare", 00:24:32.166 "progress": { 00:24:32.166 "blocks": 32768, 00:24:32.166 "percent": 50 00:24:32.166 } 00:24:32.166 }, 00:24:32.166 "base_bdevs_list": [ 00:24:32.166 { 00:24:32.166 "name": "spare", 00:24:32.166 "uuid": "dd676f31-8d51-5f80-b25c-fb59cf6e08e3", 00:24:32.166 "is_configured": true, 00:24:32.166 "data_offset": 0, 00:24:32.166 "data_size": 65536 00:24:32.166 }, 00:24:32.166 { 00:24:32.166 "name": null, 00:24:32.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:32.166 "is_configured": false, 00:24:32.166 "data_offset": 0, 00:24:32.166 "data_size": 65536 00:24:32.166 }, 00:24:32.166 { 00:24:32.166 "name": "BaseBdev3", 00:24:32.166 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:32.166 "is_configured": true, 00:24:32.166 "data_offset": 0, 00:24:32.166 "data_size": 65536 00:24:32.166 }, 00:24:32.166 { 00:24:32.166 "name": "BaseBdev4", 00:24:32.166 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:32.166 "is_configured": true, 00:24:32.166 "data_offset": 0, 00:24:32.166 "data_size": 65536 00:24:32.166 } 00:24:32.166 ] 00:24:32.166 }' 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=798 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.166 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.426 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:32.426 "name": "raid_bdev1", 00:24:32.426 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:32.426 "strip_size_kb": 0, 00:24:32.426 "state": "online", 00:24:32.426 "raid_level": "raid1", 00:24:32.426 "superblock": false, 00:24:32.426 "num_base_bdevs": 4, 00:24:32.426 
"num_base_bdevs_discovered": 3, 00:24:32.426 "num_base_bdevs_operational": 3, 00:24:32.426 "process": { 00:24:32.426 "type": "rebuild", 00:24:32.426 "target": "spare", 00:24:32.426 "progress": { 00:24:32.426 "blocks": 38912, 00:24:32.426 "percent": 59 00:24:32.426 } 00:24:32.426 }, 00:24:32.426 "base_bdevs_list": [ 00:24:32.426 { 00:24:32.426 "name": "spare", 00:24:32.426 "uuid": "dd676f31-8d51-5f80-b25c-fb59cf6e08e3", 00:24:32.426 "is_configured": true, 00:24:32.426 "data_offset": 0, 00:24:32.426 "data_size": 65536 00:24:32.426 }, 00:24:32.426 { 00:24:32.426 "name": null, 00:24:32.426 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:32.426 "is_configured": false, 00:24:32.426 "data_offset": 0, 00:24:32.426 "data_size": 65536 00:24:32.426 }, 00:24:32.426 { 00:24:32.426 "name": "BaseBdev3", 00:24:32.426 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:32.426 "is_configured": true, 00:24:32.426 "data_offset": 0, 00:24:32.426 "data_size": 65536 00:24:32.426 }, 00:24:32.426 { 00:24:32.426 "name": "BaseBdev4", 00:24:32.426 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:32.426 "is_configured": true, 00:24:32.426 "data_offset": 0, 00:24:32.426 "data_size": 65536 00:24:32.426 } 00:24:32.426 ] 00:24:32.426 }' 00:24:32.426 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:32.426 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:32.426 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:32.426 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:32.426 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:33.364 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:33.364 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:33.364 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.364 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:33.364 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:33.364 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.364 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.364 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.625 [2024-07-12 16:00:53.899732] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:33.625 [2024-07-12 16:00:53.899776] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:33.625 [2024-07-12 16:00:53.899804] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:33.625 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.625 "name": "raid_bdev1", 00:24:33.625 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:33.625 "strip_size_kb": 0, 00:24:33.625 "state": "online", 00:24:33.625 "raid_level": "raid1", 00:24:33.625 "superblock": false, 00:24:33.625 "num_base_bdevs": 4, 00:24:33.625 "num_base_bdevs_discovered": 3, 00:24:33.625 
"num_base_bdevs_operational": 3, 00:24:33.625 "base_bdevs_list": [ 00:24:33.625 { 00:24:33.625 "name": "spare", 00:24:33.625 "uuid": "dd676f31-8d51-5f80-b25c-fb59cf6e08e3", 00:24:33.625 "is_configured": true, 00:24:33.625 "data_offset": 0, 00:24:33.625 "data_size": 65536 00:24:33.625 }, 00:24:33.625 { 00:24:33.625 "name": null, 00:24:33.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.625 "is_configured": false, 00:24:33.625 "data_offset": 0, 00:24:33.625 "data_size": 65536 00:24:33.625 }, 00:24:33.625 { 00:24:33.625 "name": "BaseBdev3", 00:24:33.625 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:33.625 "is_configured": true, 00:24:33.625 "data_offset": 0, 00:24:33.625 "data_size": 65536 00:24:33.625 }, 00:24:33.625 { 00:24:33.625 "name": "BaseBdev4", 00:24:33.625 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:33.625 "is_configured": true, 00:24:33.625 "data_offset": 0, 00:24:33.625 "data_size": 65536 00:24:33.625 } 00:24:33.625 ] 00:24:33.625 }' 00:24:33.625 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.625 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.885 "name": "raid_bdev1", 00:24:33.885 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:33.885 "strip_size_kb": 0, 00:24:33.885 "state": "online", 00:24:33.885 "raid_level": "raid1", 00:24:33.885 "superblock": false, 00:24:33.885 "num_base_bdevs": 4, 00:24:33.885 "num_base_bdevs_discovered": 3, 00:24:33.885 "num_base_bdevs_operational": 3, 00:24:33.885 "base_bdevs_list": [ 00:24:33.885 { 00:24:33.885 "name": "spare", 00:24:33.885 "uuid": "dd676f31-8d51-5f80-b25c-fb59cf6e08e3", 00:24:33.885 "is_configured": true, 00:24:33.885 "data_offset": 0, 00:24:33.885 "data_size": 65536 00:24:33.885 }, 00:24:33.885 { 00:24:33.885 "name": null, 00:24:33.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.885 "is_configured": false, 00:24:33.885 "data_offset": 0, 00:24:33.885 "data_size": 65536 00:24:33.885 }, 00:24:33.885 { 00:24:33.885 "name": "BaseBdev3", 00:24:33.885 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:33.885 "is_configured": true, 00:24:33.885 "data_offset": 0, 00:24:33.885 "data_size": 65536 00:24:33.885 }, 00:24:33.885 { 00:24:33.885 
"name": "BaseBdev4", 00:24:33.885 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:33.885 "is_configured": true, 00:24:33.885 "data_offset": 0, 00:24:33.885 "data_size": 65536 00:24:33.885 } 00:24:33.885 ] 00:24:33.885 }' 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.885 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.145 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:34.145 "name": "raid_bdev1", 00:24:34.145 "uuid": "cea7e86a-ccef-49ec-8e7b-929157b50215", 00:24:34.145 "strip_size_kb": 0, 00:24:34.145 "state": "online", 00:24:34.145 "raid_level": "raid1", 00:24:34.145 "superblock": false, 00:24:34.145 "num_base_bdevs": 4, 00:24:34.145 "num_base_bdevs_discovered": 3, 00:24:34.145 "num_base_bdevs_operational": 3, 00:24:34.145 "base_bdevs_list": [ 00:24:34.145 { 00:24:34.145 "name": "spare", 00:24:34.145 "uuid": "dd676f31-8d51-5f80-b25c-fb59cf6e08e3", 00:24:34.145 "is_configured": true, 00:24:34.145 "data_offset": 0, 00:24:34.145 "data_size": 65536 00:24:34.145 }, 00:24:34.145 { 00:24:34.145 "name": null, 00:24:34.145 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.145 "is_configured": false, 00:24:34.145 "data_offset": 0, 00:24:34.145 "data_size": 65536 00:24:34.145 }, 00:24:34.145 { 00:24:34.145 "name": "BaseBdev3", 00:24:34.145 "uuid": "ba16f303-0ad0-5646-a2c1-1a2a98d5cd40", 00:24:34.145 "is_configured": true, 00:24:34.145 "data_offset": 0, 00:24:34.145 "data_size": 65536 00:24:34.145 }, 00:24:34.145 { 00:24:34.145 "name": "BaseBdev4", 00:24:34.145 "uuid": "317d70b6-8c94-5070-9f10-af168319eaad", 00:24:34.145 "is_configured": true, 00:24:34.145 "data_offset": 0, 00:24:34.145 "data_size": 65536 00:24:34.145 } 00:24:34.145 ] 00:24:34.145 }' 00:24:34.145 16:00:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:24:34.145 16:00:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:34.712 16:00:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:34.971 [2024-07-12 16:00:55.201760] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:34.971 [2024-07-12 16:00:55.201776] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:34.971 [2024-07-12 16:00:55.201820] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:34.971 [2024-07-12 16:00:55.201871] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:34.971 [2024-07-12 16:00:55.201877] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd235b0 name raid_bdev1, state offline 00:24:34.971 16:00:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.971 16:00:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:35.540 /dev/nbd0 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 
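After the array is deleted, the test expects bdev_raid_get_bdevs to return an empty list before it exports the surviving base bdev and the spare as NBD block devices for a data comparison. A hedged sketch of that sequence, reusing the exact RPC calls from the trace:

    # Tear down the array and confirm nothing is left behind (sketch)
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    $rpc -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
    [ "$($rpc -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all | jq length)" -eq 0 ]
    # Export the rebuilt member and the spare so they can be compared byte for byte
    $rpc -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0
    $rpc -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1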
00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:35.540 1+0 records in 00:24:35.540 1+0 records out 00:24:35.540 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295647 s, 13.9 MB/s 00:24:35.540 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.800 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:35.800 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.800 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:35.800 16:00:55 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:35.800 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:35.800 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:35.801 16:00:55 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:35.801 /dev/nbd1 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:35.801 1+0 records in 00:24:35.801 1+0 records out 00:24:35.801 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000336931 s, 12.2 MB/s 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 
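The waitfornbd helper seen above guards against racing the kernel: it polls /proc/partitions until the nbd device shows up, then issues a single direct-I/O read to prove the export actually serves data. A rough reconstruction follows; the retry budget of 20 matches the loop counters in the trace, while the sleep interval and the scratch-file path are assumptions made only for this sketch:

    # Wait for /dev/nbd0 to appear, then read one 4 KiB block with O_DIRECT (sketch)
    for i in $(seq 1 20); do
        grep -q -w nbd0 /proc/partitions && break
        sleep 0.1   # interval assumed for this sketch; the trace only shows the counter and the probe
    done
    dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct   # scratch path chosen for this example
    size=$(stat -c %s /tmp/nbdtest)
    rm -f /tmp/nbdtest
    [ "$size" != 0 ]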
00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:35.801 16:00:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:36.061 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2638216 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 2638216 ']' 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 2638216 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test 
-- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2638216 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2638216' 00:24:36.321 killing process with pid 2638216 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 2638216 00:24:36.321 Received shutdown signal, test time was about 60.000000 seconds 00:24:36.321 00:24:36.321 Latency(us) 00:24:36.321 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:36.321 =================================================================================================================== 00:24:36.321 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:36.321 [2024-07-12 16:00:56.742437] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:36.321 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 2638216 00:24:36.582 [2024-07-12 16:00:56.768626] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:24:36.582 00:24:36.582 real 0m24.848s 00:24:36.582 user 0m32.560s 00:24:36.582 sys 0m4.445s 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:36.582 ************************************ 00:24:36.582 END TEST raid_rebuild_test 00:24:36.582 ************************************ 00:24:36.582 16:00:56 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:36.582 16:00:56 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:24:36.582 16:00:56 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:36.582 16:00:56 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:36.582 16:00:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:36.582 ************************************ 00:24:36.582 START TEST raid_rebuild_test_sb 00:24:36.582 ************************************ 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:36.582 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2642447 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2642447 /var/tmp/spdk-raid.sock 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 2642447 ']' 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:36.583 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
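Worth noting before the next phase: the superblock variant of the test is hosted by bdevperf started in wait-for-RPC mode (-z) on a private socket, and every rpc.py call that follows targets that socket; waitforlisten simply blocks until it accepts connections. A sketch of the launch step, with the flags copied from the command line shown above:

    # Start bdevperf as the RPC target for raid_rebuild_test_sb (flags as in the trace)
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 \
        -o 3M -q 2 -U -z -L bdev_raid &
    raid_pid=$!
    # waitforlisten $raid_pid /var/tmp/spdk-raid.sock then blocks until the socket is up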
00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:36.583 16:00:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:36.843 [2024-07-12 16:00:57.042757] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:24:36.843 [2024-07-12 16:00:57.042818] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2642447 ] 00:24:36.843 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:36.843 Zero copy mechanism will not be used. 00:24:36.843 [2024-07-12 16:00:57.137407] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:36.843 [2024-07-12 16:00:57.212274] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:36.844 [2024-07-12 16:00:57.254701] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:36.844 [2024-07-12 16:00:57.254732] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:37.783 16:00:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:37.783 16:00:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:24:37.783 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:37.783 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:37.783 BaseBdev1_malloc 00:24:37.783 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:38.044 [2024-07-12 16:00:58.233144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:38.044 [2024-07-12 16:00:58.233180] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.044 [2024-07-12 16:00:58.233195] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a4b010 00:24:38.044 [2024-07-12 16:00:58.233201] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.044 [2024-07-12 16:00:58.234461] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.044 [2024-07-12 16:00:58.234480] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:38.044 BaseBdev1 00:24:38.044 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:38.044 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:38.044 BaseBdev2_malloc 00:24:38.044 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:38.304 [2024-07-12 16:00:58.603896] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:38.304 [2024-07-12 16:00:58.603922] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:24:38.304 [2024-07-12 16:00:58.603937] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a4bc30 00:24:38.304 [2024-07-12 16:00:58.603944] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.304 [2024-07-12 16:00:58.605081] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.304 [2024-07-12 16:00:58.605099] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:38.304 BaseBdev2 00:24:38.304 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:38.304 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:38.563 BaseBdev3_malloc 00:24:38.563 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:38.563 [2024-07-12 16:00:58.982549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:38.563 [2024-07-12 16:00:58.982576] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.563 [2024-07-12 16:00:58.982586] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bf32c0 00:24:38.563 [2024-07-12 16:00:58.982592] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.563 [2024-07-12 16:00:58.983734] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.563 [2024-07-12 16:00:58.983751] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:38.563 BaseBdev3 00:24:38.563 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:38.563 16:00:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:38.823 BaseBdev4_malloc 00:24:38.823 16:00:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:39.082 [2024-07-12 16:00:59.361213] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:39.082 [2024-07-12 16:00:59.361238] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.082 [2024-07-12 16:00:59.361249] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a434e0 00:24:39.082 [2024-07-12 16:00:59.361255] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.082 [2024-07-12 16:00:59.362393] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.082 [2024-07-12 16:00:59.362411] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:39.082 BaseBdev4 00:24:39.082 16:00:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:39.342 spare_malloc 00:24:39.342 16:00:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:39.342 spare_delay 00:24:39.342 16:00:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:39.601 [2024-07-12 16:00:59.932241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:39.601 [2024-07-12 16:00:59.932263] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.601 [2024-07-12 16:00:59.932273] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a41c80 00:24:39.601 [2024-07-12 16:00:59.932279] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.601 [2024-07-12 16:00:59.933425] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.601 [2024-07-12 16:00:59.933447] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:39.601 spare 00:24:39.601 16:00:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:39.862 [2024-07-12 16:01:00.116763] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:39.862 [2024-07-12 16:01:00.117796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:39.862 [2024-07-12 16:01:00.117836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:39.862 [2024-07-12 16:01:00.117870] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:39.862 [2024-07-12 16:01:00.118016] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a445b0 00:24:39.862 [2024-07-12 16:01:00.118023] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:39.862 [2024-07-12 16:01:00.118179] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf02a0 00:24:39.862 [2024-07-12 16:01:00.118296] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a445b0 00:24:39.862 [2024-07-12 16:01:00.118302] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a445b0 00:24:39.862 [2024-07-12 16:01:00.118374] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.862 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:40.122 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:40.122 "name": "raid_bdev1", 00:24:40.122 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:40.122 "strip_size_kb": 0, 00:24:40.122 "state": "online", 00:24:40.122 "raid_level": "raid1", 00:24:40.122 "superblock": true, 00:24:40.122 "num_base_bdevs": 4, 00:24:40.122 "num_base_bdevs_discovered": 4, 00:24:40.122 "num_base_bdevs_operational": 4, 00:24:40.122 "base_bdevs_list": [ 00:24:40.122 { 00:24:40.122 "name": "BaseBdev1", 00:24:40.122 "uuid": "892d0e80-3b62-50ce-9ce9-6891756e2870", 00:24:40.122 "is_configured": true, 00:24:40.122 "data_offset": 2048, 00:24:40.122 "data_size": 63488 00:24:40.122 }, 00:24:40.122 { 00:24:40.122 "name": "BaseBdev2", 00:24:40.122 "uuid": "f7252a7b-bc38-57c2-b2e7-ab652270b206", 00:24:40.122 "is_configured": true, 00:24:40.122 "data_offset": 2048, 00:24:40.122 "data_size": 63488 00:24:40.122 }, 00:24:40.122 { 00:24:40.122 "name": "BaseBdev3", 00:24:40.122 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:40.122 "is_configured": true, 00:24:40.122 "data_offset": 2048, 00:24:40.122 "data_size": 63488 00:24:40.122 }, 00:24:40.122 { 00:24:40.122 "name": "BaseBdev4", 00:24:40.122 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:40.122 "is_configured": true, 00:24:40.122 "data_offset": 2048, 00:24:40.122 "data_size": 63488 00:24:40.122 } 00:24:40.122 ] 00:24:40.122 }' 00:24:40.122 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:40.122 16:01:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:40.691 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:40.691 16:01:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:40.952 [2024-07-12 16:01:01.368130] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks 
/var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:41.216 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:41.478 [2024-07-12 16:01:01.772938] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a499e0 00:24:41.478 /dev/nbd0 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:41.478 1+0 records in 00:24:41.478 1+0 records out 00:24:41.478 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278935 s, 14.7 MB/s 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:41.478 
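The preceding setup condenses into a small recipe: each base member is a malloc bdev wrapped in a passthru bdev, the spare additionally goes through a delay bdev, the four members are assembled into a raid1 array with an on-disk superblock (-s), and the array is exported over NBD so it can be filled with data. A hedged sketch of that assembly, using only RPC calls and arguments visible in the trace (shown for BaseBdev1 and the spare; BaseBdev2-4 follow the same malloc/passthru pattern):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Base member: 32 MB malloc bdev with 512-byte blocks, hidden behind a passthru bdev
    $rpc -s $sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
    $rpc -s $sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
    # Spare: same malloc, routed through a delay bdev before the passthru
    $rpc -s $sock bdev_malloc_create 32 512 -b spare_malloc
    $rpc -s $sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    $rpc -s $sock bdev_passthru_create -b spare_delay -p spare
    # raid1 array with superblock, then export it as a block device for the data fill
    $rpc -s $sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1
    $rpc -s $sock nbd_start_disk raid_bdev1 /dev/nbd0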
16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:41.478 16:01:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:24:51.465 63488+0 records in 00:24:51.465 63488+0 records out 00:24:51.465 32505856 bytes (33 MB, 31 MiB) copied, 9.17539 s, 3.5 MB/s 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:51.465 [2024-07-12 16:01:11.202773] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:51.465 [2024-07-12 16:01:11.380133] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.465 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:51.465 "name": "raid_bdev1", 00:24:51.465 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:51.465 "strip_size_kb": 0, 00:24:51.465 "state": "online", 00:24:51.465 "raid_level": "raid1", 00:24:51.465 "superblock": true, 00:24:51.465 "num_base_bdevs": 4, 00:24:51.465 "num_base_bdevs_discovered": 3, 00:24:51.465 "num_base_bdevs_operational": 3, 00:24:51.465 "base_bdevs_list": [ 00:24:51.465 { 00:24:51.465 "name": null, 00:24:51.465 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.465 "is_configured": false, 00:24:51.465 "data_offset": 2048, 00:24:51.465 "data_size": 63488 00:24:51.465 }, 00:24:51.465 { 00:24:51.465 "name": "BaseBdev2", 00:24:51.465 "uuid": "f7252a7b-bc38-57c2-b2e7-ab652270b206", 00:24:51.466 "is_configured": true, 00:24:51.466 "data_offset": 2048, 00:24:51.466 "data_size": 63488 00:24:51.466 }, 00:24:51.466 { 00:24:51.466 "name": "BaseBdev3", 00:24:51.466 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:51.466 "is_configured": true, 00:24:51.466 "data_offset": 2048, 00:24:51.466 "data_size": 63488 00:24:51.466 }, 00:24:51.466 { 00:24:51.466 "name": "BaseBdev4", 00:24:51.466 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:51.466 "is_configured": true, 00:24:51.466 "data_offset": 2048, 00:24:51.466 "data_size": 63488 00:24:51.466 } 00:24:51.466 ] 00:24:51.466 }' 00:24:51.466 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:51.466 16:01:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:51.726 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:51.986 [2024-07-12 16:01:12.314498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:51.986 [2024-07-12 16:01:12.317392] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a49e50 00:24:51.986 [2024-07-12 16:01:12.318957] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:51.986 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:52.928 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:52.928 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:52.928 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:52.928 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:52.928 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:52.928 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.928 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.189 16:01:13 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:53.189 "name": "raid_bdev1", 00:24:53.189 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:53.189 "strip_size_kb": 0, 00:24:53.189 "state": "online", 00:24:53.189 "raid_level": "raid1", 00:24:53.189 "superblock": true, 00:24:53.189 "num_base_bdevs": 4, 00:24:53.189 "num_base_bdevs_discovered": 4, 00:24:53.189 "num_base_bdevs_operational": 4, 00:24:53.189 "process": { 00:24:53.189 "type": "rebuild", 00:24:53.189 "target": "spare", 00:24:53.189 "progress": { 00:24:53.189 "blocks": 22528, 00:24:53.189 "percent": 35 00:24:53.189 } 00:24:53.189 }, 00:24:53.189 "base_bdevs_list": [ 00:24:53.189 { 00:24:53.189 "name": "spare", 00:24:53.189 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:24:53.189 "is_configured": true, 00:24:53.189 "data_offset": 2048, 00:24:53.189 "data_size": 63488 00:24:53.189 }, 00:24:53.189 { 00:24:53.189 "name": "BaseBdev2", 00:24:53.189 "uuid": "f7252a7b-bc38-57c2-b2e7-ab652270b206", 00:24:53.189 "is_configured": true, 00:24:53.189 "data_offset": 2048, 00:24:53.189 "data_size": 63488 00:24:53.189 }, 00:24:53.189 { 00:24:53.189 "name": "BaseBdev3", 00:24:53.189 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:53.189 "is_configured": true, 00:24:53.189 "data_offset": 2048, 00:24:53.189 "data_size": 63488 00:24:53.189 }, 00:24:53.190 { 00:24:53.190 "name": "BaseBdev4", 00:24:53.190 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:53.190 "is_configured": true, 00:24:53.190 "data_offset": 2048, 00:24:53.190 "data_size": 63488 00:24:53.190 } 00:24:53.190 ] 00:24:53.190 }' 00:24:53.190 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:53.190 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:53.190 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:53.190 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:53.190 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:53.450 [2024-07-12 16:01:13.791433] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:53.450 [2024-07-12 16:01:13.827805] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:53.450 [2024-07-12 16:01:13.827837] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:53.450 [2024-07-12 16:01:13.827848] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:53.450 [2024-07-12 16:01:13.827853] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=3 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.450 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.710 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:53.710 "name": "raid_bdev1", 00:24:53.710 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:53.710 "strip_size_kb": 0, 00:24:53.710 "state": "online", 00:24:53.710 "raid_level": "raid1", 00:24:53.710 "superblock": true, 00:24:53.710 "num_base_bdevs": 4, 00:24:53.710 "num_base_bdevs_discovered": 3, 00:24:53.710 "num_base_bdevs_operational": 3, 00:24:53.710 "base_bdevs_list": [ 00:24:53.710 { 00:24:53.710 "name": null, 00:24:53.710 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:53.710 "is_configured": false, 00:24:53.710 "data_offset": 2048, 00:24:53.710 "data_size": 63488 00:24:53.710 }, 00:24:53.710 { 00:24:53.710 "name": "BaseBdev2", 00:24:53.710 "uuid": "f7252a7b-bc38-57c2-b2e7-ab652270b206", 00:24:53.710 "is_configured": true, 00:24:53.710 "data_offset": 2048, 00:24:53.710 "data_size": 63488 00:24:53.710 }, 00:24:53.710 { 00:24:53.710 "name": "BaseBdev3", 00:24:53.710 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:53.710 "is_configured": true, 00:24:53.710 "data_offset": 2048, 00:24:53.710 "data_size": 63488 00:24:53.710 }, 00:24:53.710 { 00:24:53.710 "name": "BaseBdev4", 00:24:53.710 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:53.710 "is_configured": true, 00:24:53.710 "data_offset": 2048, 00:24:53.710 "data_size": 63488 00:24:53.710 } 00:24:53.710 ] 00:24:53.710 }' 00:24:53.710 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:53.710 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:54.280 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:54.280 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:54.280 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:54.280 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:54.280 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:54.280 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.280 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.540 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:54.540 "name": "raid_bdev1", 00:24:54.540 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:54.540 "strip_size_kb": 0, 00:24:54.540 "state": "online", 00:24:54.540 
"raid_level": "raid1", 00:24:54.540 "superblock": true, 00:24:54.540 "num_base_bdevs": 4, 00:24:54.540 "num_base_bdevs_discovered": 3, 00:24:54.540 "num_base_bdevs_operational": 3, 00:24:54.540 "base_bdevs_list": [ 00:24:54.540 { 00:24:54.540 "name": null, 00:24:54.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.540 "is_configured": false, 00:24:54.540 "data_offset": 2048, 00:24:54.540 "data_size": 63488 00:24:54.540 }, 00:24:54.540 { 00:24:54.540 "name": "BaseBdev2", 00:24:54.540 "uuid": "f7252a7b-bc38-57c2-b2e7-ab652270b206", 00:24:54.540 "is_configured": true, 00:24:54.540 "data_offset": 2048, 00:24:54.540 "data_size": 63488 00:24:54.540 }, 00:24:54.540 { 00:24:54.540 "name": "BaseBdev3", 00:24:54.540 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:54.540 "is_configured": true, 00:24:54.540 "data_offset": 2048, 00:24:54.540 "data_size": 63488 00:24:54.540 }, 00:24:54.540 { 00:24:54.540 "name": "BaseBdev4", 00:24:54.540 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:54.540 "is_configured": true, 00:24:54.540 "data_offset": 2048, 00:24:54.540 "data_size": 63488 00:24:54.540 } 00:24:54.540 ] 00:24:54.540 }' 00:24:54.540 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:54.540 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:54.540 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:54.540 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:54.540 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:54.800 [2024-07-12 16:01:15.010535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:54.800 [2024-07-12 16:01:15.013269] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf0430 00:24:54.800 [2024-07-12 16:01:15.014432] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:54.800 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:55.811 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:55.811 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.811 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:55.811 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:55.811 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.811 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.811 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.811 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:55.811 "name": "raid_bdev1", 00:24:55.811 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:55.811 "strip_size_kb": 0, 00:24:55.811 "state": "online", 00:24:55.811 "raid_level": "raid1", 00:24:55.811 "superblock": true, 00:24:55.811 "num_base_bdevs": 4, 00:24:55.811 
"num_base_bdevs_discovered": 4, 00:24:55.811 "num_base_bdevs_operational": 4, 00:24:55.811 "process": { 00:24:55.811 "type": "rebuild", 00:24:55.811 "target": "spare", 00:24:55.811 "progress": { 00:24:55.811 "blocks": 22528, 00:24:55.811 "percent": 35 00:24:55.811 } 00:24:55.811 }, 00:24:55.811 "base_bdevs_list": [ 00:24:55.811 { 00:24:55.811 "name": "spare", 00:24:55.811 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:24:55.811 "is_configured": true, 00:24:55.811 "data_offset": 2048, 00:24:55.811 "data_size": 63488 00:24:55.811 }, 00:24:55.811 { 00:24:55.811 "name": "BaseBdev2", 00:24:55.811 "uuid": "f7252a7b-bc38-57c2-b2e7-ab652270b206", 00:24:55.811 "is_configured": true, 00:24:55.811 "data_offset": 2048, 00:24:55.811 "data_size": 63488 00:24:55.811 }, 00:24:55.811 { 00:24:55.811 "name": "BaseBdev3", 00:24:55.811 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:55.811 "is_configured": true, 00:24:55.811 "data_offset": 2048, 00:24:55.811 "data_size": 63488 00:24:55.811 }, 00:24:55.811 { 00:24:55.811 "name": "BaseBdev4", 00:24:55.811 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:55.811 "is_configured": true, 00:24:55.811 "data_offset": 2048, 00:24:55.811 "data_size": 63488 00:24:55.811 } 00:24:55.811 ] 00:24:55.811 }' 00:24:55.811 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:56.071 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:56.071 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:56.071 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:56.071 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:56.071 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:56.071 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:56.071 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:56.071 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:56.071 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:56.071 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:56.071 [2024-07-12 16:01:16.491248] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:56.331 [2024-07-12 16:01:16.623571] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1bf0430 00:24:56.331 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:56.331 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:24:56.331 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:56.331 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:56.331 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:56.331 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:56.331 16:01:16 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:56.331 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.331 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:56.591 "name": "raid_bdev1", 00:24:56.591 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:56.591 "strip_size_kb": 0, 00:24:56.591 "state": "online", 00:24:56.591 "raid_level": "raid1", 00:24:56.591 "superblock": true, 00:24:56.591 "num_base_bdevs": 4, 00:24:56.591 "num_base_bdevs_discovered": 3, 00:24:56.591 "num_base_bdevs_operational": 3, 00:24:56.591 "process": { 00:24:56.591 "type": "rebuild", 00:24:56.591 "target": "spare", 00:24:56.591 "progress": { 00:24:56.591 "blocks": 34816, 00:24:56.591 "percent": 54 00:24:56.591 } 00:24:56.591 }, 00:24:56.591 "base_bdevs_list": [ 00:24:56.591 { 00:24:56.591 "name": "spare", 00:24:56.591 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:24:56.591 "is_configured": true, 00:24:56.591 "data_offset": 2048, 00:24:56.591 "data_size": 63488 00:24:56.591 }, 00:24:56.591 { 00:24:56.591 "name": null, 00:24:56.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.591 "is_configured": false, 00:24:56.591 "data_offset": 2048, 00:24:56.591 "data_size": 63488 00:24:56.591 }, 00:24:56.591 { 00:24:56.591 "name": "BaseBdev3", 00:24:56.591 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:56.591 "is_configured": true, 00:24:56.591 "data_offset": 2048, 00:24:56.591 "data_size": 63488 00:24:56.591 }, 00:24:56.591 { 00:24:56.591 "name": "BaseBdev4", 00:24:56.591 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:56.591 "is_configured": true, 00:24:56.591 "data_offset": 2048, 00:24:56.591 "data_size": 63488 00:24:56.591 } 00:24:56.591 ] 00:24:56.591 }' 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=822 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.591 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
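The interesting part of this phase is that BaseBdev2 is pulled out while the rebuild onto the spare is still running, and the test then re-checks that the process survives with only three operational members. A minimal sketch of that check, built from the RPC calls and jq filters in the trace:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    # Drop a healthy member while the spare is being rebuilt
    $rpc -s $sock bdev_raid_remove_base_bdev BaseBdev2
    # The rebuild must still be in progress and still targeting the spare
    info=$($rpc -s $sock bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
    [ "$(echo "$info" | jq -r '.process.type // "none"')" = rebuild ]
    [ "$(echo "$info" | jq -r '.process.target // "none"')" = spare ]
    [ "$(echo "$info" | jq -r '.num_base_bdevs_operational')" -eq 3 ]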
00:24:56.851 16:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:56.851 "name": "raid_bdev1", 00:24:56.851 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:56.851 "strip_size_kb": 0, 00:24:56.851 "state": "online", 00:24:56.851 "raid_level": "raid1", 00:24:56.851 "superblock": true, 00:24:56.851 "num_base_bdevs": 4, 00:24:56.851 "num_base_bdevs_discovered": 3, 00:24:56.851 "num_base_bdevs_operational": 3, 00:24:56.851 "process": { 00:24:56.851 "type": "rebuild", 00:24:56.851 "target": "spare", 00:24:56.851 "progress": { 00:24:56.851 "blocks": 38912, 00:24:56.851 "percent": 61 00:24:56.851 } 00:24:56.851 }, 00:24:56.851 "base_bdevs_list": [ 00:24:56.851 { 00:24:56.851 "name": "spare", 00:24:56.851 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:24:56.851 "is_configured": true, 00:24:56.851 "data_offset": 2048, 00:24:56.851 "data_size": 63488 00:24:56.851 }, 00:24:56.851 { 00:24:56.851 "name": null, 00:24:56.851 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.851 "is_configured": false, 00:24:56.851 "data_offset": 2048, 00:24:56.851 "data_size": 63488 00:24:56.851 }, 00:24:56.851 { 00:24:56.851 "name": "BaseBdev3", 00:24:56.851 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:56.851 "is_configured": true, 00:24:56.851 "data_offset": 2048, 00:24:56.851 "data_size": 63488 00:24:56.851 }, 00:24:56.851 { 00:24:56.851 "name": "BaseBdev4", 00:24:56.851 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:56.851 "is_configured": true, 00:24:56.851 "data_offset": 2048, 00:24:56.851 "data_size": 63488 00:24:56.851 } 00:24:56.851 ] 00:24:56.851 }' 00:24:56.851 16:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:56.851 16:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:56.851 16:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:56.851 16:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:56.851 16:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:57.791 [2024-07-12 16:01:18.232987] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:57.791 [2024-07-12 16:01:18.233036] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:57.792 [2024-07-12 16:01:18.233112] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:57.792 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.052 "name": "raid_bdev1", 00:24:58.052 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:58.052 "strip_size_kb": 0, 00:24:58.052 "state": "online", 00:24:58.052 "raid_level": "raid1", 00:24:58.052 "superblock": true, 00:24:58.052 "num_base_bdevs": 4, 00:24:58.052 "num_base_bdevs_discovered": 3, 00:24:58.052 "num_base_bdevs_operational": 3, 00:24:58.052 "base_bdevs_list": [ 00:24:58.052 { 00:24:58.052 "name": "spare", 00:24:58.052 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:24:58.052 "is_configured": true, 00:24:58.052 "data_offset": 2048, 00:24:58.052 "data_size": 63488 00:24:58.052 }, 00:24:58.052 { 00:24:58.052 "name": null, 00:24:58.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.052 "is_configured": false, 00:24:58.052 "data_offset": 2048, 00:24:58.052 "data_size": 63488 00:24:58.052 }, 00:24:58.052 { 00:24:58.052 "name": "BaseBdev3", 00:24:58.052 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:58.052 "is_configured": true, 00:24:58.052 "data_offset": 2048, 00:24:58.052 "data_size": 63488 00:24:58.052 }, 00:24:58.052 { 00:24:58.052 "name": "BaseBdev4", 00:24:58.052 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:58.052 "is_configured": true, 00:24:58.052 "data_offset": 2048, 00:24:58.052 "data_size": 63488 00:24:58.052 } 00:24:58.052 ] 00:24:58.052 }' 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:58.052 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.313 "name": "raid_bdev1", 00:24:58.313 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:58.313 "strip_size_kb": 0, 00:24:58.313 "state": "online", 00:24:58.313 "raid_level": "raid1", 00:24:58.313 "superblock": true, 00:24:58.313 "num_base_bdevs": 4, 00:24:58.313 "num_base_bdevs_discovered": 3, 00:24:58.313 "num_base_bdevs_operational": 3, 00:24:58.313 "base_bdevs_list": [ 00:24:58.313 { 00:24:58.313 "name": "spare", 00:24:58.313 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:24:58.313 "is_configured": true, 00:24:58.313 "data_offset": 2048, 00:24:58.313 "data_size": 63488 00:24:58.313 
}, 00:24:58.313 { 00:24:58.313 "name": null, 00:24:58.313 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.313 "is_configured": false, 00:24:58.313 "data_offset": 2048, 00:24:58.313 "data_size": 63488 00:24:58.313 }, 00:24:58.313 { 00:24:58.313 "name": "BaseBdev3", 00:24:58.313 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:58.313 "is_configured": true, 00:24:58.313 "data_offset": 2048, 00:24:58.313 "data_size": 63488 00:24:58.313 }, 00:24:58.313 { 00:24:58.313 "name": "BaseBdev4", 00:24:58.313 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:58.313 "is_configured": true, 00:24:58.313 "data_offset": 2048, 00:24:58.313 "data_size": 63488 00:24:58.313 } 00:24:58.313 ] 00:24:58.313 }' 00:24:58.313 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:58.574 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.574 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:58.574 "name": "raid_bdev1", 00:24:58.574 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:24:58.574 "strip_size_kb": 0, 00:24:58.574 "state": "online", 00:24:58.574 "raid_level": "raid1", 00:24:58.574 "superblock": true, 00:24:58.574 "num_base_bdevs": 4, 00:24:58.574 "num_base_bdevs_discovered": 3, 00:24:58.574 "num_base_bdevs_operational": 3, 00:24:58.574 "base_bdevs_list": [ 00:24:58.574 { 00:24:58.574 "name": "spare", 00:24:58.574 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:24:58.574 "is_configured": true, 00:24:58.574 "data_offset": 2048, 00:24:58.574 "data_size": 63488 00:24:58.574 }, 00:24:58.574 { 00:24:58.574 "name": null, 00:24:58.574 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.574 "is_configured": false, 00:24:58.574 "data_offset": 2048, 00:24:58.574 "data_size": 63488 00:24:58.574 }, 00:24:58.574 { 00:24:58.574 "name": 
"BaseBdev3", 00:24:58.574 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:24:58.574 "is_configured": true, 00:24:58.574 "data_offset": 2048, 00:24:58.574 "data_size": 63488 00:24:58.574 }, 00:24:58.574 { 00:24:58.574 "name": "BaseBdev4", 00:24:58.574 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:24:58.574 "is_configured": true, 00:24:58.574 "data_offset": 2048, 00:24:58.574 "data_size": 63488 00:24:58.574 } 00:24:58.574 ] 00:24:58.574 }' 00:24:58.574 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:58.574 16:01:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:59.144 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:59.404 [2024-07-12 16:01:19.688822] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:59.404 [2024-07-12 16:01:19.688840] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:59.404 [2024-07-12 16:01:19.688885] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:59.404 [2024-07-12 16:01:19.688938] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:59.404 [2024-07-12 16:01:19.688944] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a445b0 name raid_bdev1, state offline 00:24:59.404 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.404 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:59.663 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:59.663 /dev/nbd0 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:59.922 
16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:59.922 1+0 records in 00:24:59.922 1+0 records out 00:24:59.922 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032336 s, 12.7 MB/s 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:59.922 /dev/nbd1 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:59.922 1+0 records in 00:24:59.922 1+0 records out 00:24:59.922 
4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282213 s, 14.5 MB/s 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:59.922 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:00.182 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:00.442 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:00.702 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:00.963 [2024-07-12 16:01:21.182155] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:00.963 [2024-07-12 16:01:21.182191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:00.963 [2024-07-12 16:01:21.182207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a4a770 00:25:00.963 [2024-07-12 16:01:21.182214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:00.963 [2024-07-12 16:01:21.183523] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:00.963 [2024-07-12 16:01:21.183547] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:00.963 [2024-07-12 16:01:21.183608] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:00.963 [2024-07-12 16:01:21.183629] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:00.963 [2024-07-12 16:01:21.183718] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:00.963 [2024-07-12 16:01:21.183783] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:00.963 spare 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.963 [2024-07-12 16:01:21.284074] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a42370 
00:25:00.963 [2024-07-12 16:01:21.284082] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:00.963 [2024-07-12 16:01:21.284244] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf13b0 00:25:00.963 [2024-07-12 16:01:21.284362] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a42370 00:25:00.963 [2024-07-12 16:01:21.284368] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a42370 00:25:00.963 [2024-07-12 16:01:21.284445] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:00.963 "name": "raid_bdev1", 00:25:00.963 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:00.963 "strip_size_kb": 0, 00:25:00.963 "state": "online", 00:25:00.963 "raid_level": "raid1", 00:25:00.963 "superblock": true, 00:25:00.963 "num_base_bdevs": 4, 00:25:00.963 "num_base_bdevs_discovered": 3, 00:25:00.963 "num_base_bdevs_operational": 3, 00:25:00.963 "base_bdevs_list": [ 00:25:00.963 { 00:25:00.963 "name": "spare", 00:25:00.963 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:25:00.963 "is_configured": true, 00:25:00.963 "data_offset": 2048, 00:25:00.963 "data_size": 63488 00:25:00.963 }, 00:25:00.963 { 00:25:00.963 "name": null, 00:25:00.963 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:00.963 "is_configured": false, 00:25:00.963 "data_offset": 2048, 00:25:00.963 "data_size": 63488 00:25:00.963 }, 00:25:00.963 { 00:25:00.963 "name": "BaseBdev3", 00:25:00.963 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:00.963 "is_configured": true, 00:25:00.963 "data_offset": 2048, 00:25:00.963 "data_size": 63488 00:25:00.963 }, 00:25:00.963 { 00:25:00.963 "name": "BaseBdev4", 00:25:00.963 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:00.963 "is_configured": true, 00:25:00.963 "data_offset": 2048, 00:25:00.963 "data_size": 63488 00:25:00.963 } 00:25:00.963 ] 00:25:00.963 }' 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:00.963 16:01:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:01.534 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:01.534 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:01.534 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:01.534 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:01.534 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:01.534 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.534 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.794 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.794 "name": "raid_bdev1", 00:25:01.794 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:01.794 "strip_size_kb": 0, 00:25:01.794 "state": "online", 00:25:01.794 "raid_level": "raid1", 00:25:01.794 "superblock": true, 00:25:01.794 "num_base_bdevs": 4, 00:25:01.794 "num_base_bdevs_discovered": 3, 00:25:01.794 
"num_base_bdevs_operational": 3, 00:25:01.794 "base_bdevs_list": [ 00:25:01.794 { 00:25:01.794 "name": "spare", 00:25:01.794 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:25:01.794 "is_configured": true, 00:25:01.794 "data_offset": 2048, 00:25:01.794 "data_size": 63488 00:25:01.794 }, 00:25:01.794 { 00:25:01.794 "name": null, 00:25:01.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:01.794 "is_configured": false, 00:25:01.794 "data_offset": 2048, 00:25:01.794 "data_size": 63488 00:25:01.794 }, 00:25:01.794 { 00:25:01.794 "name": "BaseBdev3", 00:25:01.794 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:01.794 "is_configured": true, 00:25:01.794 "data_offset": 2048, 00:25:01.794 "data_size": 63488 00:25:01.794 }, 00:25:01.794 { 00:25:01.794 "name": "BaseBdev4", 00:25:01.794 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:01.794 "is_configured": true, 00:25:01.794 "data_offset": 2048, 00:25:01.794 "data_size": 63488 00:25:01.794 } 00:25:01.794 ] 00:25:01.794 }' 00:25:01.794 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.794 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:01.794 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.794 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:01.794 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.794 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:02.054 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:02.054 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:02.314 [2024-07-12 16:01:22.581760] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.314 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.579 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.579 "name": "raid_bdev1", 00:25:02.579 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:02.579 "strip_size_kb": 0, 00:25:02.579 "state": "online", 00:25:02.579 "raid_level": "raid1", 00:25:02.579 "superblock": true, 00:25:02.579 "num_base_bdevs": 4, 00:25:02.579 "num_base_bdevs_discovered": 2, 00:25:02.579 "num_base_bdevs_operational": 2, 00:25:02.579 "base_bdevs_list": [ 00:25:02.579 { 00:25:02.579 "name": null, 00:25:02.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.579 "is_configured": false, 00:25:02.579 "data_offset": 2048, 00:25:02.579 "data_size": 63488 00:25:02.579 }, 00:25:02.579 { 00:25:02.579 "name": null, 00:25:02.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.579 "is_configured": false, 00:25:02.579 "data_offset": 2048, 00:25:02.579 "data_size": 63488 00:25:02.579 }, 00:25:02.579 { 00:25:02.579 "name": "BaseBdev3", 00:25:02.579 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:02.579 "is_configured": true, 00:25:02.579 "data_offset": 2048, 00:25:02.579 "data_size": 63488 00:25:02.579 }, 00:25:02.579 { 00:25:02.579 "name": "BaseBdev4", 00:25:02.579 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:02.579 "is_configured": true, 00:25:02.579 "data_offset": 2048, 00:25:02.579 "data_size": 63488 00:25:02.579 } 00:25:02.579 ] 00:25:02.579 }' 00:25:02.579 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:02.579 16:01:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:03.149 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:03.149 [2024-07-12 16:01:23.500104] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:03.149 [2024-07-12 16:01:23.500221] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:03.149 [2024-07-12 16:01:23.500230] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:03.149 [2024-07-12 16:01:23.500249] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:03.149 [2024-07-12 16:01:23.503048] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bf13b0 00:25:03.149 [2024-07-12 16:01:23.504129] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:03.149 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:04.089 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:04.089 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:04.089 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:04.089 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:04.089 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:04.089 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.089 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.349 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:04.349 "name": "raid_bdev1", 00:25:04.349 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:04.349 "strip_size_kb": 0, 00:25:04.349 "state": "online", 00:25:04.349 "raid_level": "raid1", 00:25:04.349 "superblock": true, 00:25:04.349 "num_base_bdevs": 4, 00:25:04.349 "num_base_bdevs_discovered": 3, 00:25:04.349 "num_base_bdevs_operational": 3, 00:25:04.349 "process": { 00:25:04.349 "type": "rebuild", 00:25:04.349 "target": "spare", 00:25:04.349 "progress": { 00:25:04.349 "blocks": 22528, 00:25:04.349 "percent": 35 00:25:04.349 } 00:25:04.349 }, 00:25:04.349 "base_bdevs_list": [ 00:25:04.349 { 00:25:04.349 "name": "spare", 00:25:04.349 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:25:04.349 "is_configured": true, 00:25:04.349 "data_offset": 2048, 00:25:04.349 "data_size": 63488 00:25:04.349 }, 00:25:04.349 { 00:25:04.349 "name": null, 00:25:04.349 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.349 "is_configured": false, 00:25:04.349 "data_offset": 2048, 00:25:04.349 "data_size": 63488 00:25:04.349 }, 00:25:04.349 { 00:25:04.349 "name": "BaseBdev3", 00:25:04.349 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:04.349 "is_configured": true, 00:25:04.349 "data_offset": 2048, 00:25:04.349 "data_size": 63488 00:25:04.349 }, 00:25:04.349 { 00:25:04.349 "name": "BaseBdev4", 00:25:04.349 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:04.349 "is_configured": true, 00:25:04.349 "data_offset": 2048, 00:25:04.349 "data_size": 63488 00:25:04.349 } 00:25:04.349 ] 00:25:04.349 }' 00:25:04.349 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:04.349 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:04.349 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:04.349 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:04.349 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:04.609 [2024-07-12 16:01:24.964651] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:04.609 [2024-07-12 16:01:25.013033] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:04.609 [2024-07-12 16:01:25.013066] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:04.609 [2024-07-12 16:01:25.013077] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:04.609 [2024-07-12 16:01:25.013082] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:04.609 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:04.609 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:04.609 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:04.609 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.609 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.609 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:04.609 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.609 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.610 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.610 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.610 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.610 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.890 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.890 "name": "raid_bdev1", 00:25:04.890 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:04.890 "strip_size_kb": 0, 00:25:04.890 "state": "online", 00:25:04.890 "raid_level": "raid1", 00:25:04.890 "superblock": true, 00:25:04.890 "num_base_bdevs": 4, 00:25:04.890 "num_base_bdevs_discovered": 2, 00:25:04.890 "num_base_bdevs_operational": 2, 00:25:04.890 "base_bdevs_list": [ 00:25:04.890 { 00:25:04.890 "name": null, 00:25:04.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.890 "is_configured": false, 00:25:04.890 "data_offset": 2048, 00:25:04.890 "data_size": 63488 00:25:04.890 }, 00:25:04.890 { 00:25:04.890 "name": null, 00:25:04.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.890 "is_configured": false, 00:25:04.890 "data_offset": 2048, 00:25:04.890 "data_size": 63488 00:25:04.890 }, 00:25:04.890 { 00:25:04.890 "name": "BaseBdev3", 00:25:04.890 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:04.890 "is_configured": true, 00:25:04.890 "data_offset": 2048, 00:25:04.890 "data_size": 63488 00:25:04.890 }, 00:25:04.890 { 00:25:04.890 "name": "BaseBdev4", 00:25:04.890 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:04.890 "is_configured": true, 00:25:04.890 "data_offset": 2048, 00:25:04.890 "data_size": 63488 
00:25:04.890 } 00:25:04.890 ] 00:25:04.890 }' 00:25:04.890 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.890 16:01:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:05.459 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:05.719 [2024-07-12 16:01:25.935091] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:05.719 [2024-07-12 16:01:25.935127] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:05.719 [2024-07-12 16:01:25.935142] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a427e0 00:25:05.719 [2024-07-12 16:01:25.935148] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:05.719 [2024-07-12 16:01:25.935457] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:05.719 [2024-07-12 16:01:25.935468] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:05.719 [2024-07-12 16:01:25.935531] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:05.719 [2024-07-12 16:01:25.935539] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:05.719 [2024-07-12 16:01:25.935544] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:05.719 [2024-07-12 16:01:25.935556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:05.719 [2024-07-12 16:01:25.938278] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a481d0 00:25:05.719 [2024-07-12 16:01:25.939359] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:05.719 spare 00:25:05.719 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:06.658 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:06.658 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.658 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:06.658 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:06.658 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:06.658 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.658 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.917 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:06.917 "name": "raid_bdev1", 00:25:06.917 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:06.917 "strip_size_kb": 0, 00:25:06.917 "state": "online", 00:25:06.917 "raid_level": "raid1", 00:25:06.917 "superblock": true, 00:25:06.917 "num_base_bdevs": 4, 00:25:06.917 "num_base_bdevs_discovered": 3, 00:25:06.917 "num_base_bdevs_operational": 3, 00:25:06.917 "process": { 00:25:06.917 "type": "rebuild", 00:25:06.917 "target": 
"spare", 00:25:06.917 "progress": { 00:25:06.917 "blocks": 24576, 00:25:06.917 "percent": 38 00:25:06.917 } 00:25:06.917 }, 00:25:06.917 "base_bdevs_list": [ 00:25:06.917 { 00:25:06.917 "name": "spare", 00:25:06.917 "uuid": "cd03c2a4-a64e-5a03-a8ec-7e7719a062a1", 00:25:06.917 "is_configured": true, 00:25:06.917 "data_offset": 2048, 00:25:06.917 "data_size": 63488 00:25:06.917 }, 00:25:06.917 { 00:25:06.917 "name": null, 00:25:06.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:06.917 "is_configured": false, 00:25:06.917 "data_offset": 2048, 00:25:06.918 "data_size": 63488 00:25:06.918 }, 00:25:06.918 { 00:25:06.918 "name": "BaseBdev3", 00:25:06.918 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:06.918 "is_configured": true, 00:25:06.918 "data_offset": 2048, 00:25:06.918 "data_size": 63488 00:25:06.918 }, 00:25:06.918 { 00:25:06.918 "name": "BaseBdev4", 00:25:06.918 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:06.918 "is_configured": true, 00:25:06.918 "data_offset": 2048, 00:25:06.918 "data_size": 63488 00:25:06.918 } 00:25:06.918 ] 00:25:06.918 }' 00:25:06.918 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:06.918 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:06.918 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:06.918 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:06.918 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:07.177 [2024-07-12 16:01:27.428154] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:07.177 [2024-07-12 16:01:27.448167] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:07.177 [2024-07-12 16:01:27.448197] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:07.177 [2024-07-12 16:01:27.448207] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:07.177 [2024-07-12 16:01:27.448211] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.177 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.437 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.437 "name": "raid_bdev1", 00:25:07.437 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:07.437 "strip_size_kb": 0, 00:25:07.437 "state": "online", 00:25:07.437 "raid_level": "raid1", 00:25:07.437 "superblock": true, 00:25:07.437 "num_base_bdevs": 4, 00:25:07.437 "num_base_bdevs_discovered": 2, 00:25:07.437 "num_base_bdevs_operational": 2, 00:25:07.437 "base_bdevs_list": [ 00:25:07.437 { 00:25:07.437 "name": null, 00:25:07.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.437 "is_configured": false, 00:25:07.437 "data_offset": 2048, 00:25:07.437 "data_size": 63488 00:25:07.437 }, 00:25:07.437 { 00:25:07.437 "name": null, 00:25:07.437 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.437 "is_configured": false, 00:25:07.437 "data_offset": 2048, 00:25:07.437 "data_size": 63488 00:25:07.437 }, 00:25:07.437 { 00:25:07.437 "name": "BaseBdev3", 00:25:07.437 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:07.437 "is_configured": true, 00:25:07.437 "data_offset": 2048, 00:25:07.437 "data_size": 63488 00:25:07.437 }, 00:25:07.437 { 00:25:07.437 "name": "BaseBdev4", 00:25:07.437 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:07.437 "is_configured": true, 00:25:07.437 "data_offset": 2048, 00:25:07.437 "data_size": 63488 00:25:07.437 } 00:25:07.437 ] 00:25:07.437 }' 00:25:07.437 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.437 16:01:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:08.005 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:08.005 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:08.005 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:08.005 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:08.005 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:08.005 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:08.005 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.005 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:08.005 "name": "raid_bdev1", 00:25:08.005 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:08.005 "strip_size_kb": 0, 00:25:08.005 "state": "online", 00:25:08.005 "raid_level": "raid1", 00:25:08.005 "superblock": true, 00:25:08.005 "num_base_bdevs": 4, 00:25:08.005 "num_base_bdevs_discovered": 2, 00:25:08.005 "num_base_bdevs_operational": 2, 00:25:08.005 "base_bdevs_list": [ 00:25:08.005 { 00:25:08.005 "name": null, 00:25:08.005 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:08.005 "is_configured": false, 00:25:08.005 "data_offset": 2048, 00:25:08.005 "data_size": 63488 00:25:08.005 }, 00:25:08.005 { 00:25:08.005 "name": null, 00:25:08.005 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:08.005 "is_configured": false, 00:25:08.005 "data_offset": 2048, 00:25:08.005 "data_size": 63488 00:25:08.005 }, 00:25:08.005 { 00:25:08.005 "name": "BaseBdev3", 00:25:08.005 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:08.005 "is_configured": true, 00:25:08.005 "data_offset": 2048, 00:25:08.005 "data_size": 63488 00:25:08.005 }, 00:25:08.005 { 00:25:08.005 "name": "BaseBdev4", 00:25:08.005 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:08.005 "is_configured": true, 00:25:08.005 "data_offset": 2048, 00:25:08.005 "data_size": 63488 00:25:08.005 } 00:25:08.005 ] 00:25:08.005 }' 00:25:08.005 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:08.265 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:08.265 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:08.265 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:08.265 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:08.265 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:08.525 [2024-07-12 16:01:28.871765] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:08.525 [2024-07-12 16:01:28.871797] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.525 [2024-07-12 16:01:28.871809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a4b240 00:25:08.526 [2024-07-12 16:01:28.871816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.526 [2024-07-12 16:01:28.872102] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.526 [2024-07-12 16:01:28.872114] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:08.526 [2024-07-12 16:01:28.872158] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:08.526 [2024-07-12 16:01:28.872166] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:08.526 [2024-07-12 16:01:28.872171] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:08.526 BaseBdev1 00:25:08.526 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:09.466 16:01:29 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.466 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.727 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:09.727 "name": "raid_bdev1", 00:25:09.727 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:09.727 "strip_size_kb": 0, 00:25:09.727 "state": "online", 00:25:09.727 "raid_level": "raid1", 00:25:09.727 "superblock": true, 00:25:09.727 "num_base_bdevs": 4, 00:25:09.727 "num_base_bdevs_discovered": 2, 00:25:09.727 "num_base_bdevs_operational": 2, 00:25:09.727 "base_bdevs_list": [ 00:25:09.727 { 00:25:09.727 "name": null, 00:25:09.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:09.727 "is_configured": false, 00:25:09.727 "data_offset": 2048, 00:25:09.727 "data_size": 63488 00:25:09.727 }, 00:25:09.727 { 00:25:09.727 "name": null, 00:25:09.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:09.727 "is_configured": false, 00:25:09.727 "data_offset": 2048, 00:25:09.727 "data_size": 63488 00:25:09.727 }, 00:25:09.727 { 00:25:09.727 "name": "BaseBdev3", 00:25:09.727 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:09.727 "is_configured": true, 00:25:09.727 "data_offset": 2048, 00:25:09.727 "data_size": 63488 00:25:09.727 }, 00:25:09.727 { 00:25:09.727 "name": "BaseBdev4", 00:25:09.727 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:09.727 "is_configured": true, 00:25:09.727 "data_offset": 2048, 00:25:09.727 "data_size": 63488 00:25:09.727 } 00:25:09.727 ] 00:25:09.727 }' 00:25:09.727 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:09.727 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:10.297 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:10.297 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:10.297 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:10.297 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:10.297 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:10.297 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.297 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:10.557 "name": "raid_bdev1", 00:25:10.557 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:10.557 "strip_size_kb": 0, 00:25:10.557 "state": "online", 00:25:10.557 "raid_level": "raid1", 00:25:10.557 "superblock": true, 
00:25:10.557 "num_base_bdevs": 4, 00:25:10.557 "num_base_bdevs_discovered": 2, 00:25:10.557 "num_base_bdevs_operational": 2, 00:25:10.557 "base_bdevs_list": [ 00:25:10.557 { 00:25:10.557 "name": null, 00:25:10.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.557 "is_configured": false, 00:25:10.557 "data_offset": 2048, 00:25:10.557 "data_size": 63488 00:25:10.557 }, 00:25:10.557 { 00:25:10.557 "name": null, 00:25:10.557 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.557 "is_configured": false, 00:25:10.557 "data_offset": 2048, 00:25:10.557 "data_size": 63488 00:25:10.557 }, 00:25:10.557 { 00:25:10.557 "name": "BaseBdev3", 00:25:10.557 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:10.557 "is_configured": true, 00:25:10.557 "data_offset": 2048, 00:25:10.557 "data_size": 63488 00:25:10.557 }, 00:25:10.557 { 00:25:10.557 "name": "BaseBdev4", 00:25:10.557 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:10.557 "is_configured": true, 00:25:10.557 "data_offset": 2048, 00:25:10.557 "data_size": 63488 00:25:10.557 } 00:25:10.557 ] 00:25:10.557 }' 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:10.557 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:10.558 16:01:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:10.817 [2024-07-12 16:01:31.117469] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:10.817 [2024-07-12 16:01:31.117565] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:10.817 [2024-07-12 16:01:31.117574] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:10.817 request: 00:25:10.817 { 00:25:10.817 "base_bdev": "BaseBdev1", 00:25:10.817 "raid_bdev": "raid_bdev1", 00:25:10.817 "method": "bdev_raid_add_base_bdev", 00:25:10.817 "req_id": 1 00:25:10.817 } 00:25:10.817 Got JSON-RPC error response 00:25:10.817 response: 00:25:10.817 { 00:25:10.817 "code": -22, 00:25:10.817 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:10.817 } 00:25:10.817 16:01:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:25:10.817 16:01:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:10.817 16:01:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:10.817 16:01:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:10.817 16:01:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.756 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.016 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.016 "name": "raid_bdev1", 00:25:12.016 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:12.016 "strip_size_kb": 0, 00:25:12.016 "state": "online", 00:25:12.016 "raid_level": "raid1", 00:25:12.016 "superblock": true, 00:25:12.016 "num_base_bdevs": 4, 00:25:12.016 "num_base_bdevs_discovered": 2, 00:25:12.016 "num_base_bdevs_operational": 2, 00:25:12.016 "base_bdevs_list": [ 00:25:12.016 { 00:25:12.016 "name": null, 00:25:12.016 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.016 "is_configured": false, 00:25:12.016 "data_offset": 2048, 00:25:12.016 "data_size": 63488 00:25:12.016 }, 00:25:12.016 { 00:25:12.016 "name": null, 00:25:12.016 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:25:12.016 "is_configured": false, 00:25:12.016 "data_offset": 2048, 00:25:12.016 "data_size": 63488 00:25:12.016 }, 00:25:12.016 { 00:25:12.016 "name": "BaseBdev3", 00:25:12.016 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:12.016 "is_configured": true, 00:25:12.016 "data_offset": 2048, 00:25:12.016 "data_size": 63488 00:25:12.016 }, 00:25:12.016 { 00:25:12.016 "name": "BaseBdev4", 00:25:12.016 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:12.016 "is_configured": true, 00:25:12.016 "data_offset": 2048, 00:25:12.016 "data_size": 63488 00:25:12.016 } 00:25:12.016 ] 00:25:12.016 }' 00:25:12.016 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.016 16:01:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:12.584 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:12.585 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.585 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:12.585 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:12.585 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.585 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.585 16:01:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.844 "name": "raid_bdev1", 00:25:12.844 "uuid": "878e8d81-9116-4fe5-b2b2-9710472611b1", 00:25:12.844 "strip_size_kb": 0, 00:25:12.844 "state": "online", 00:25:12.844 "raid_level": "raid1", 00:25:12.844 "superblock": true, 00:25:12.844 "num_base_bdevs": 4, 00:25:12.844 "num_base_bdevs_discovered": 2, 00:25:12.844 "num_base_bdevs_operational": 2, 00:25:12.844 "base_bdevs_list": [ 00:25:12.844 { 00:25:12.844 "name": null, 00:25:12.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.844 "is_configured": false, 00:25:12.844 "data_offset": 2048, 00:25:12.844 "data_size": 63488 00:25:12.844 }, 00:25:12.844 { 00:25:12.844 "name": null, 00:25:12.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.844 "is_configured": false, 00:25:12.844 "data_offset": 2048, 00:25:12.844 "data_size": 63488 00:25:12.844 }, 00:25:12.844 { 00:25:12.844 "name": "BaseBdev3", 00:25:12.844 "uuid": "18e66c94-8503-59a9-bee5-144910dbbe88", 00:25:12.844 "is_configured": true, 00:25:12.844 "data_offset": 2048, 00:25:12.844 "data_size": 63488 00:25:12.844 }, 00:25:12.844 { 00:25:12.844 "name": "BaseBdev4", 00:25:12.844 "uuid": "88f91335-e323-5a00-a1b6-445984171183", 00:25:12.844 "is_configured": true, 00:25:12.844 "data_offset": 2048, 00:25:12.844 "data_size": 63488 00:25:12.844 } 00:25:12.844 ] 00:25:12.844 }' 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2642447 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 2642447 ']' 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 2642447 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2642447 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2642447' 00:25:12.844 killing process with pid 2642447 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 2642447 00:25:12.844 Received shutdown signal, test time was about 60.000000 seconds 00:25:12.844 00:25:12.844 Latency(us) 00:25:12.844 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:12.844 =================================================================================================================== 00:25:12.844 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:12.844 [2024-07-12 16:01:33.200913] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:12.844 [2024-07-12 16:01:33.200985] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:12.844 [2024-07-12 16:01:33.201036] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:12.844 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 2642447 00:25:12.844 [2024-07-12 16:01:33.201043] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a42370 name raid_bdev1, state offline 00:25:12.844 [2024-07-12 16:01:33.227189] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:25:13.104 00:25:13.104 real 0m36.384s 00:25:13.104 user 0m51.447s 00:25:13.104 sys 0m5.677s 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:13.104 ************************************ 00:25:13.104 END TEST raid_rebuild_test_sb 00:25:13.104 ************************************ 00:25:13.104 16:01:33 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:13.104 16:01:33 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:25:13.104 16:01:33 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:13.104 16:01:33 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:13.104 16:01:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:13.104 ************************************ 00:25:13.104 START TEST raid_rebuild_test_io 00:25:13.104 ************************************ 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2648996 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2648996 /var/tmp/spdk-raid.sock 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 2648996 ']' 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:13.104 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:13.104 16:01:33 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:13.104 [2024-07-12 16:01:33.491546] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:25:13.104 [2024-07-12 16:01:33.491603] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2648996 ] 00:25:13.104 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:13.104 Zero copy mechanism will not be used. 00:25:13.364 [2024-07-12 16:01:33.584023] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:13.364 [2024-07-12 16:01:33.646909] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:13.364 [2024-07-12 16:01:33.691707] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:13.364 [2024-07-12 16:01:33.691736] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:13.934 16:01:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:13.934 16:01:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:25:13.934 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:13.934 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:14.194 BaseBdev1_malloc 00:25:14.194 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:14.454 [2024-07-12 16:01:34.701791] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:14.454 [2024-07-12 16:01:34.701827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:14.454 [2024-07-12 16:01:34.701843] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d69010 00:25:14.454 [2024-07-12 16:01:34.701849] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:14.454 [2024-07-12 16:01:34.703141] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:14.454 [2024-07-12 16:01:34.703162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:14.454 BaseBdev1 00:25:14.454 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in 
"${base_bdevs[@]}" 00:25:14.454 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:14.714 BaseBdev2_malloc 00:25:14.714 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:14.714 [2024-07-12 16:01:35.084543] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:14.714 [2024-07-12 16:01:35.084572] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:14.714 [2024-07-12 16:01:35.084585] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d69c30 00:25:14.714 [2024-07-12 16:01:35.084591] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:14.714 [2024-07-12 16:01:35.085757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:14.714 [2024-07-12 16:01:35.085775] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:14.714 BaseBdev2 00:25:14.714 16:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:14.714 16:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:14.984 BaseBdev3_malloc 00:25:14.984 16:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:15.287 [2024-07-12 16:01:35.455136] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:15.287 [2024-07-12 16:01:35.455164] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:15.287 [2024-07-12 16:01:35.455175] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f112c0 00:25:15.287 [2024-07-12 16:01:35.455181] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:15.287 [2024-07-12 16:01:35.456332] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:15.287 [2024-07-12 16:01:35.456351] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:15.287 BaseBdev3 00:25:15.287 16:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:15.287 16:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:15.288 BaseBdev4_malloc 00:25:15.288 16:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:15.572 [2024-07-12 16:01:35.833687] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:15.572 [2024-07-12 16:01:35.833718] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:15.572 [2024-07-12 16:01:35.833730] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 
0x0x1d614e0 00:25:15.572 [2024-07-12 16:01:35.833736] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:15.572 [2024-07-12 16:01:35.834879] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:15.572 [2024-07-12 16:01:35.834896] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:15.572 BaseBdev4 00:25:15.572 16:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:15.833 spare_malloc 00:25:15.833 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:15.833 spare_delay 00:25:15.833 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:16.092 [2024-07-12 16:01:36.428788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:16.092 [2024-07-12 16:01:36.428813] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:16.093 [2024-07-12 16:01:36.428823] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d5fc80 00:25:16.093 [2024-07-12 16:01:36.428834] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:16.093 [2024-07-12 16:01:36.429975] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:16.093 [2024-07-12 16:01:36.429993] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:16.093 spare 00:25:16.093 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:16.354 [2024-07-12 16:01:36.621293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:16.354 [2024-07-12 16:01:36.622250] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:16.354 [2024-07-12 16:01:36.622289] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:16.354 [2024-07-12 16:01:36.622323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:16.354 [2024-07-12 16:01:36.622387] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d625b0 00:25:16.354 [2024-07-12 16:01:36.622393] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:16.354 [2024-07-12 16:01:36.622540] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d66b50 00:25:16.354 [2024-07-12 16:01:36.622651] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d625b0 00:25:16.354 [2024-07-12 16:01:36.622656] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1d625b0 00:25:16.354 [2024-07-12 16:01:36.622741] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:16.354 16:01:36 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.354 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.615 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:16.615 "name": "raid_bdev1", 00:25:16.615 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:16.615 "strip_size_kb": 0, 00:25:16.615 "state": "online", 00:25:16.615 "raid_level": "raid1", 00:25:16.615 "superblock": false, 00:25:16.615 "num_base_bdevs": 4, 00:25:16.615 "num_base_bdevs_discovered": 4, 00:25:16.615 "num_base_bdevs_operational": 4, 00:25:16.615 "base_bdevs_list": [ 00:25:16.615 { 00:25:16.615 "name": "BaseBdev1", 00:25:16.615 "uuid": "7054fb05-cb21-5711-9f3b-115023ea9dd2", 00:25:16.615 "is_configured": true, 00:25:16.615 "data_offset": 0, 00:25:16.615 "data_size": 65536 00:25:16.615 }, 00:25:16.615 { 00:25:16.615 "name": "BaseBdev2", 00:25:16.615 "uuid": "f80e04e4-ddf2-5afd-9636-52857c42ded6", 00:25:16.615 "is_configured": true, 00:25:16.615 "data_offset": 0, 00:25:16.615 "data_size": 65536 00:25:16.615 }, 00:25:16.615 { 00:25:16.615 "name": "BaseBdev3", 00:25:16.615 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:16.615 "is_configured": true, 00:25:16.615 "data_offset": 0, 00:25:16.615 "data_size": 65536 00:25:16.615 }, 00:25:16.615 { 00:25:16.615 "name": "BaseBdev4", 00:25:16.615 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:16.615 "is_configured": true, 00:25:16.615 "data_offset": 0, 00:25:16.615 "data_size": 65536 00:25:16.615 } 00:25:16.615 ] 00:25:16.615 }' 00:25:16.615 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:16.615 16:01:36 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:17.184 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:17.184 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:17.184 [2024-07-12 16:01:37.547871] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:17.184 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:17.184 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.184 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:17.444 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:17.444 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:17.444 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:17.444 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:17.444 [2024-07-12 16:01:37.857772] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d681b0 00:25:17.444 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:17.444 Zero copy mechanism will not be used. 00:25:17.444 Running I/O for 60 seconds... 00:25:17.705 [2024-07-12 16:01:37.942982] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:17.705 [2024-07-12 16:01:37.949399] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1d681b0 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.705 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.964 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:17.964 "name": "raid_bdev1", 00:25:17.964 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:17.964 "strip_size_kb": 0, 00:25:17.964 "state": "online", 00:25:17.964 "raid_level": "raid1", 00:25:17.964 "superblock": false, 00:25:17.964 "num_base_bdevs": 4, 00:25:17.964 "num_base_bdevs_discovered": 3, 00:25:17.964 "num_base_bdevs_operational": 3, 00:25:17.964 "base_bdevs_list": [ 00:25:17.964 { 00:25:17.964 "name": null, 00:25:17.964 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.964 "is_configured": false, 00:25:17.964 "data_offset": 0, 00:25:17.964 "data_size": 65536 00:25:17.964 }, 00:25:17.964 { 00:25:17.964 
"name": "BaseBdev2", 00:25:17.964 "uuid": "f80e04e4-ddf2-5afd-9636-52857c42ded6", 00:25:17.964 "is_configured": true, 00:25:17.964 "data_offset": 0, 00:25:17.964 "data_size": 65536 00:25:17.964 }, 00:25:17.965 { 00:25:17.965 "name": "BaseBdev3", 00:25:17.965 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:17.965 "is_configured": true, 00:25:17.965 "data_offset": 0, 00:25:17.965 "data_size": 65536 00:25:17.965 }, 00:25:17.965 { 00:25:17.965 "name": "BaseBdev4", 00:25:17.965 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:17.965 "is_configured": true, 00:25:17.965 "data_offset": 0, 00:25:17.965 "data_size": 65536 00:25:17.965 } 00:25:17.965 ] 00:25:17.965 }' 00:25:17.965 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:17.965 16:01:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:18.534 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:18.534 [2024-07-12 16:01:38.877362] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:18.534 [2024-07-12 16:01:38.919396] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a6ece0 00:25:18.534 [2024-07-12 16:01:38.921046] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:18.534 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:18.794 [2024-07-12 16:01:39.034343] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:18.794 [2024-07-12 16:01:39.034933] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:19.053 [2024-07-12 16:01:39.251449] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:19.053 [2024-07-12 16:01:39.251559] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:19.314 [2024-07-12 16:01:39.582469] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:19.314 [2024-07-12 16:01:39.583273] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:19.573 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:19.573 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:19.573 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:19.573 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:19.573 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:19.573 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.573 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:19.833 [2024-07-12 16:01:40.065322] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 
offset_end: 18432 00:25:19.833 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:19.833 "name": "raid_bdev1", 00:25:19.833 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:19.833 "strip_size_kb": 0, 00:25:19.833 "state": "online", 00:25:19.833 "raid_level": "raid1", 00:25:19.833 "superblock": false, 00:25:19.833 "num_base_bdevs": 4, 00:25:19.833 "num_base_bdevs_discovered": 4, 00:25:19.833 "num_base_bdevs_operational": 4, 00:25:19.833 "process": { 00:25:19.833 "type": "rebuild", 00:25:19.833 "target": "spare", 00:25:19.833 "progress": { 00:25:19.833 "blocks": 14336, 00:25:19.833 "percent": 21 00:25:19.833 } 00:25:19.833 }, 00:25:19.833 "base_bdevs_list": [ 00:25:19.833 { 00:25:19.833 "name": "spare", 00:25:19.833 "uuid": "5808654e-0710-5db7-8ab5-caf510178061", 00:25:19.833 "is_configured": true, 00:25:19.833 "data_offset": 0, 00:25:19.833 "data_size": 65536 00:25:19.833 }, 00:25:19.833 { 00:25:19.833 "name": "BaseBdev2", 00:25:19.833 "uuid": "f80e04e4-ddf2-5afd-9636-52857c42ded6", 00:25:19.833 "is_configured": true, 00:25:19.833 "data_offset": 0, 00:25:19.833 "data_size": 65536 00:25:19.833 }, 00:25:19.833 { 00:25:19.833 "name": "BaseBdev3", 00:25:19.833 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:19.833 "is_configured": true, 00:25:19.833 "data_offset": 0, 00:25:19.833 "data_size": 65536 00:25:19.833 }, 00:25:19.833 { 00:25:19.833 "name": "BaseBdev4", 00:25:19.833 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:19.833 "is_configured": true, 00:25:19.833 "data_offset": 0, 00:25:19.833 "data_size": 65536 00:25:19.833 } 00:25:19.834 ] 00:25:19.834 }' 00:25:19.834 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:19.834 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:19.834 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:19.834 [2024-07-12 16:01:40.210733] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:19.834 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:19.834 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:20.093 [2024-07-12 16:01:40.405071] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:20.093 [2024-07-12 16:01:40.531842] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:20.093 [2024-07-12 16:01:40.540149] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:20.093 [2024-07-12 16:01:40.540171] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:20.093 [2024-07-12 16:01:40.540177] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:20.353 [2024-07-12 16:01:40.563622] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1d681b0 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 
-- # local expected_state=online 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.353 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.613 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:20.613 "name": "raid_bdev1", 00:25:20.613 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:20.613 "strip_size_kb": 0, 00:25:20.613 "state": "online", 00:25:20.613 "raid_level": "raid1", 00:25:20.613 "superblock": false, 00:25:20.613 "num_base_bdevs": 4, 00:25:20.613 "num_base_bdevs_discovered": 3, 00:25:20.613 "num_base_bdevs_operational": 3, 00:25:20.613 "base_bdevs_list": [ 00:25:20.613 { 00:25:20.613 "name": null, 00:25:20.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.613 "is_configured": false, 00:25:20.613 "data_offset": 0, 00:25:20.613 "data_size": 65536 00:25:20.613 }, 00:25:20.613 { 00:25:20.613 "name": "BaseBdev2", 00:25:20.613 "uuid": "f80e04e4-ddf2-5afd-9636-52857c42ded6", 00:25:20.613 "is_configured": true, 00:25:20.613 "data_offset": 0, 00:25:20.613 "data_size": 65536 00:25:20.613 }, 00:25:20.613 { 00:25:20.613 "name": "BaseBdev3", 00:25:20.613 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:20.613 "is_configured": true, 00:25:20.613 "data_offset": 0, 00:25:20.613 "data_size": 65536 00:25:20.613 }, 00:25:20.613 { 00:25:20.613 "name": "BaseBdev4", 00:25:20.613 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:20.613 "is_configured": true, 00:25:20.613 "data_offset": 0, 00:25:20.613 "data_size": 65536 00:25:20.613 } 00:25:20.613 ] 00:25:20.613 }' 00:25:20.613 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:20.613 16:01:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:21.182 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:21.182 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:21.182 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:21.182 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:21.182 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:21.182 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.182 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.182 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:21.183 "name": "raid_bdev1", 00:25:21.183 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:21.183 "strip_size_kb": 0, 00:25:21.183 "state": "online", 00:25:21.183 "raid_level": "raid1", 00:25:21.183 "superblock": false, 00:25:21.183 "num_base_bdevs": 4, 00:25:21.183 "num_base_bdevs_discovered": 3, 00:25:21.183 "num_base_bdevs_operational": 3, 00:25:21.183 "base_bdevs_list": [ 00:25:21.183 { 00:25:21.183 "name": null, 00:25:21.183 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:21.183 "is_configured": false, 00:25:21.183 "data_offset": 0, 00:25:21.183 "data_size": 65536 00:25:21.183 }, 00:25:21.183 { 00:25:21.183 "name": "BaseBdev2", 00:25:21.183 "uuid": "f80e04e4-ddf2-5afd-9636-52857c42ded6", 00:25:21.183 "is_configured": true, 00:25:21.183 "data_offset": 0, 00:25:21.183 "data_size": 65536 00:25:21.183 }, 00:25:21.183 { 00:25:21.183 "name": "BaseBdev3", 00:25:21.183 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:21.183 "is_configured": true, 00:25:21.183 "data_offset": 0, 00:25:21.183 "data_size": 65536 00:25:21.183 }, 00:25:21.183 { 00:25:21.183 "name": "BaseBdev4", 00:25:21.183 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:21.183 "is_configured": true, 00:25:21.183 "data_offset": 0, 00:25:21.183 "data_size": 65536 00:25:21.183 } 00:25:21.183 ] 00:25:21.183 }' 00:25:21.183 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:21.183 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:21.183 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:21.442 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:21.443 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:21.443 [2024-07-12 16:01:41.835879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:21.443 [2024-07-12 16:01:41.878604] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1db3320 00:25:21.443 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:21.443 [2024-07-12 16:01:41.879794] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:21.702 [2024-07-12 16:01:41.987856] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:21.702 [2024-07-12 16:01:41.988609] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:21.962 [2024-07-12 16:01:42.221827] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:21.962 [2024-07-12 16:01:42.222189] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:22.222 [2024-07-12 16:01:42.570195] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:22.482 [2024-07-12 16:01:42.788863] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 
00:25:22.482 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:22.482 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:22.482 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:22.482 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:22.482 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:22.482 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.482 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.742 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:22.742 "name": "raid_bdev1", 00:25:22.742 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:22.742 "strip_size_kb": 0, 00:25:22.742 "state": "online", 00:25:22.742 "raid_level": "raid1", 00:25:22.742 "superblock": false, 00:25:22.742 "num_base_bdevs": 4, 00:25:22.742 "num_base_bdevs_discovered": 4, 00:25:22.742 "num_base_bdevs_operational": 4, 00:25:22.742 "process": { 00:25:22.742 "type": "rebuild", 00:25:22.742 "target": "spare", 00:25:22.742 "progress": { 00:25:22.742 "blocks": 12288, 00:25:22.742 "percent": 18 00:25:22.742 } 00:25:22.742 }, 00:25:22.742 "base_bdevs_list": [ 00:25:22.742 { 00:25:22.742 "name": "spare", 00:25:22.742 "uuid": "5808654e-0710-5db7-8ab5-caf510178061", 00:25:22.742 "is_configured": true, 00:25:22.742 "data_offset": 0, 00:25:22.742 "data_size": 65536 00:25:22.742 }, 00:25:22.742 { 00:25:22.742 "name": "BaseBdev2", 00:25:22.742 "uuid": "f80e04e4-ddf2-5afd-9636-52857c42ded6", 00:25:22.742 "is_configured": true, 00:25:22.742 "data_offset": 0, 00:25:22.742 "data_size": 65536 00:25:22.742 }, 00:25:22.742 { 00:25:22.742 "name": "BaseBdev3", 00:25:22.742 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:22.742 "is_configured": true, 00:25:22.742 "data_offset": 0, 00:25:22.742 "data_size": 65536 00:25:22.742 }, 00:25:22.742 { 00:25:22.742 "name": "BaseBdev4", 00:25:22.742 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:22.742 "is_configured": true, 00:25:22.742 "data_offset": 0, 00:25:22.742 "data_size": 65536 00:25:22.742 } 00:25:22.742 ] 00:25:22.742 }' 00:25:22.742 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:22.742 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:22.743 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:22.743 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:22.743 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:22.743 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:22.743 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:22.743 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:22.743 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:23.002 [2024-07-12 16:01:43.242963] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:23.002 [2024-07-12 16:01:43.342223] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:23.002 [2024-07-12 16:01:43.398303] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1d681b0 00:25:23.002 [2024-07-12 16:01:43.398322] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1db3320 00:25:23.002 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:23.002 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:23.002 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:23.002 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:23.002 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:23.002 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:23.002 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:23.002 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.002 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.263 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:23.263 "name": "raid_bdev1", 00:25:23.263 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:23.263 "strip_size_kb": 0, 00:25:23.263 "state": "online", 00:25:23.263 "raid_level": "raid1", 00:25:23.263 "superblock": false, 00:25:23.263 "num_base_bdevs": 4, 00:25:23.263 "num_base_bdevs_discovered": 3, 00:25:23.263 "num_base_bdevs_operational": 3, 00:25:23.263 "process": { 00:25:23.263 "type": "rebuild", 00:25:23.263 "target": "spare", 00:25:23.263 "progress": { 00:25:23.263 "blocks": 20480, 00:25:23.263 "percent": 31 00:25:23.263 } 00:25:23.263 }, 00:25:23.263 "base_bdevs_list": [ 00:25:23.263 { 00:25:23.263 "name": "spare", 00:25:23.263 "uuid": "5808654e-0710-5db7-8ab5-caf510178061", 00:25:23.263 "is_configured": true, 00:25:23.263 "data_offset": 0, 00:25:23.263 "data_size": 65536 00:25:23.263 }, 00:25:23.263 { 00:25:23.263 "name": null, 00:25:23.263 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.263 "is_configured": false, 00:25:23.263 "data_offset": 0, 00:25:23.263 "data_size": 65536 00:25:23.263 }, 00:25:23.263 { 00:25:23.263 "name": "BaseBdev3", 00:25:23.263 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:23.263 "is_configured": true, 00:25:23.263 "data_offset": 0, 00:25:23.263 "data_size": 65536 00:25:23.263 }, 00:25:23.263 { 00:25:23.263 "name": "BaseBdev4", 00:25:23.263 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:23.263 "is_configured": true, 00:25:23.263 "data_offset": 0, 00:25:23.263 "data_size": 65536 00:25:23.263 } 00:25:23.263 ] 00:25:23.263 }' 00:25:23.263 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:23.263 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
00:25:23.263 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=849 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:23.523 "name": "raid_bdev1", 00:25:23.523 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:23.523 "strip_size_kb": 0, 00:25:23.523 "state": "online", 00:25:23.523 "raid_level": "raid1", 00:25:23.523 "superblock": false, 00:25:23.523 "num_base_bdevs": 4, 00:25:23.523 "num_base_bdevs_discovered": 3, 00:25:23.523 "num_base_bdevs_operational": 3, 00:25:23.523 "process": { 00:25:23.523 "type": "rebuild", 00:25:23.523 "target": "spare", 00:25:23.523 "progress": { 00:25:23.523 "blocks": 26624, 00:25:23.523 "percent": 40 00:25:23.523 } 00:25:23.523 }, 00:25:23.523 "base_bdevs_list": [ 00:25:23.523 { 00:25:23.523 "name": "spare", 00:25:23.523 "uuid": "5808654e-0710-5db7-8ab5-caf510178061", 00:25:23.523 "is_configured": true, 00:25:23.523 "data_offset": 0, 00:25:23.523 "data_size": 65536 00:25:23.523 }, 00:25:23.523 { 00:25:23.523 "name": null, 00:25:23.523 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.523 "is_configured": false, 00:25:23.523 "data_offset": 0, 00:25:23.523 "data_size": 65536 00:25:23.523 }, 00:25:23.523 { 00:25:23.523 "name": "BaseBdev3", 00:25:23.523 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:23.523 "is_configured": true, 00:25:23.523 "data_offset": 0, 00:25:23.523 "data_size": 65536 00:25:23.523 }, 00:25:23.523 { 00:25:23.523 "name": "BaseBdev4", 00:25:23.523 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:23.523 "is_configured": true, 00:25:23.523 "data_offset": 0, 00:25:23.523 "data_size": 65536 00:25:23.523 } 00:25:23.523 ] 00:25:23.523 }' 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:23.523 [2024-07-12 16:01:43.947667] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:23.523 [2024-07-12 16:01:43.947818] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:23.523 16:01:43 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:23.783 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:23.783 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:24.722 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:24.722 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:24.722 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:24.722 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:24.722 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:24.722 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:24.722 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.722 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.982 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:24.982 "name": "raid_bdev1", 00:25:24.982 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:24.982 "strip_size_kb": 0, 00:25:24.982 "state": "online", 00:25:24.982 "raid_level": "raid1", 00:25:24.982 "superblock": false, 00:25:24.982 "num_base_bdevs": 4, 00:25:24.982 "num_base_bdevs_discovered": 3, 00:25:24.982 "num_base_bdevs_operational": 3, 00:25:24.982 "process": { 00:25:24.982 "type": "rebuild", 00:25:24.982 "target": "spare", 00:25:24.982 "progress": { 00:25:24.983 "blocks": 49152, 00:25:24.983 "percent": 75 00:25:24.983 } 00:25:24.983 }, 00:25:24.983 "base_bdevs_list": [ 00:25:24.983 { 00:25:24.983 "name": "spare", 00:25:24.983 "uuid": "5808654e-0710-5db7-8ab5-caf510178061", 00:25:24.983 "is_configured": true, 00:25:24.983 "data_offset": 0, 00:25:24.983 "data_size": 65536 00:25:24.983 }, 00:25:24.983 { 00:25:24.983 "name": null, 00:25:24.983 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:24.983 "is_configured": false, 00:25:24.983 "data_offset": 0, 00:25:24.983 "data_size": 65536 00:25:24.983 }, 00:25:24.983 { 00:25:24.983 "name": "BaseBdev3", 00:25:24.983 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:24.983 "is_configured": true, 00:25:24.983 "data_offset": 0, 00:25:24.983 "data_size": 65536 00:25:24.983 }, 00:25:24.983 { 00:25:24.983 "name": "BaseBdev4", 00:25:24.983 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:24.983 "is_configured": true, 00:25:24.983 "data_offset": 0, 00:25:24.983 "data_size": 65536 00:25:24.983 } 00:25:24.983 ] 00:25:24.983 }' 00:25:24.983 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:24.983 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:24.983 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:24.983 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:24.983 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:25.922 [2024-07-12 16:01:46.007101] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 
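The repeated @706/@707/@710 entries above are one iteration per second of a bounded polling loop: while bash's SECONDS counter stays under the 849-second budget, re-verify the rebuild progress and sleep, stopping once the process is no longer reported. A sketch of that loop, reusing the illustrative check_rebuild_running helper from the previous note:

    timeout=849          # value copied from "local timeout=849" in the trace
    while (( SECONDS < timeout )); do
        check_rebuild_running raid_bdev1 rebuild spare || break
        sleep 1
    done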
00:25:25.922 [2024-07-12 16:01:46.113790] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:25.922 [2024-07-12 16:01:46.115975] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:25.922 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:25.922 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:25.922 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:25.922 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:25.922 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:25.922 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:25.922 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:25.922 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:26.182 "name": "raid_bdev1", 00:25:26.182 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:26.182 "strip_size_kb": 0, 00:25:26.182 "state": "online", 00:25:26.182 "raid_level": "raid1", 00:25:26.182 "superblock": false, 00:25:26.182 "num_base_bdevs": 4, 00:25:26.182 "num_base_bdevs_discovered": 3, 00:25:26.182 "num_base_bdevs_operational": 3, 00:25:26.182 "base_bdevs_list": [ 00:25:26.182 { 00:25:26.182 "name": "spare", 00:25:26.182 "uuid": "5808654e-0710-5db7-8ab5-caf510178061", 00:25:26.182 "is_configured": true, 00:25:26.182 "data_offset": 0, 00:25:26.182 "data_size": 65536 00:25:26.182 }, 00:25:26.182 { 00:25:26.182 "name": null, 00:25:26.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.182 "is_configured": false, 00:25:26.182 "data_offset": 0, 00:25:26.182 "data_size": 65536 00:25:26.182 }, 00:25:26.182 { 00:25:26.182 "name": "BaseBdev3", 00:25:26.182 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:26.182 "is_configured": true, 00:25:26.182 "data_offset": 0, 00:25:26.182 "data_size": 65536 00:25:26.182 }, 00:25:26.182 { 00:25:26.182 "name": "BaseBdev4", 00:25:26.182 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:26.182 "is_configured": true, 00:25:26.182 "data_offset": 0, 00:25:26.182 "data_size": 65536 00:25:26.182 } 00:25:26.182 ] 00:25:26.182 }' 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 
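After the break, the test confirms that no process is reported (none/none) and that the array is still online as raid1 with three of the four members configured, as the bdev_raid_get_bdevs output below shows. One way to express that final state check, sketched from the RPC output rather than quoted from verify_raid_bdev_state itself:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all |
        jq -r '.[] | select(.name == "raid_bdev1")')
    [[ $(jq -r '.state'      <<< "$info") == online ]]
    [[ $(jq -r '.raid_level' <<< "$info") == raid1 ]]
    (( $(jq -r '.num_base_bdevs_discovered'   <<< "$info") == 3 ))
    (( $(jq -r '.num_base_bdevs_operational'  <<< "$info") == 3 ))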
00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.182 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.442 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:26.442 "name": "raid_bdev1", 00:25:26.442 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:26.442 "strip_size_kb": 0, 00:25:26.442 "state": "online", 00:25:26.442 "raid_level": "raid1", 00:25:26.442 "superblock": false, 00:25:26.442 "num_base_bdevs": 4, 00:25:26.442 "num_base_bdevs_discovered": 3, 00:25:26.442 "num_base_bdevs_operational": 3, 00:25:26.442 "base_bdevs_list": [ 00:25:26.442 { 00:25:26.442 "name": "spare", 00:25:26.442 "uuid": "5808654e-0710-5db7-8ab5-caf510178061", 00:25:26.442 "is_configured": true, 00:25:26.442 "data_offset": 0, 00:25:26.442 "data_size": 65536 00:25:26.442 }, 00:25:26.442 { 00:25:26.442 "name": null, 00:25:26.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.442 "is_configured": false, 00:25:26.442 "data_offset": 0, 00:25:26.442 "data_size": 65536 00:25:26.442 }, 00:25:26.442 { 00:25:26.442 "name": "BaseBdev3", 00:25:26.442 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:26.442 "is_configured": true, 00:25:26.442 "data_offset": 0, 00:25:26.442 "data_size": 65536 00:25:26.442 }, 00:25:26.442 { 00:25:26.442 "name": "BaseBdev4", 00:25:26.442 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:26.442 "is_configured": true, 00:25:26.442 "data_offset": 0, 00:25:26.442 "data_size": 65536 00:25:26.442 } 00:25:26.442 ] 00:25:26.442 }' 00:25:26.442 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:26.442 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:26.442 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.702 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.702 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.702 "name": "raid_bdev1", 00:25:26.702 "uuid": "ecc0f45d-5392-449d-b2ee-62baa7f0ff89", 00:25:26.702 "strip_size_kb": 0, 00:25:26.702 "state": "online", 00:25:26.702 "raid_level": "raid1", 00:25:26.702 "superblock": false, 00:25:26.702 "num_base_bdevs": 4, 00:25:26.702 "num_base_bdevs_discovered": 3, 00:25:26.702 "num_base_bdevs_operational": 3, 00:25:26.702 "base_bdevs_list": [ 00:25:26.702 { 00:25:26.702 "name": "spare", 00:25:26.702 "uuid": "5808654e-0710-5db7-8ab5-caf510178061", 00:25:26.702 "is_configured": true, 00:25:26.702 "data_offset": 0, 00:25:26.702 "data_size": 65536 00:25:26.702 }, 00:25:26.702 { 00:25:26.702 "name": null, 00:25:26.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.702 "is_configured": false, 00:25:26.702 "data_offset": 0, 00:25:26.702 "data_size": 65536 00:25:26.702 }, 00:25:26.702 { 00:25:26.702 "name": "BaseBdev3", 00:25:26.702 "uuid": "ee79fc3b-c2c8-559c-a386-c1b220840c89", 00:25:26.702 "is_configured": true, 00:25:26.702 "data_offset": 0, 00:25:26.702 "data_size": 65536 00:25:26.702 }, 00:25:26.702 { 00:25:26.702 "name": "BaseBdev4", 00:25:26.702 "uuid": "a444dbc8-9c87-50aa-9a5e-e2b85bd06f43", 00:25:26.702 "is_configured": true, 00:25:26.702 "data_offset": 0, 00:25:26.702 "data_size": 65536 00:25:26.702 } 00:25:26.702 ] 00:25:26.702 }' 00:25:26.702 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.702 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:27.271 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:27.530 [2024-07-12 16:01:47.858199] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:27.530 [2024-07-12 16:01:47.858221] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:27.530 00:25:27.530 Latency(us) 00:25:27.530 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:27.530 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:27.530 raid_bdev1 : 10.08 99.15 297.46 0.00 0.00 14273.97 253.64 116956.55 00:25:27.530 =================================================================================================================== 00:25:27.530 Total : 99.15 297.46 0.00 0.00 14273.97 253.64 116956.55 00:25:27.530 [2024-07-12 16:01:47.961653] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:27.531 [2024-07-12 16:01:47.961678] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:27.531 [2024-07-12 16:01:47.961757] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:27.531 [2024-07-12 16:01:47.961764] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d625b0 name raid_bdev1, state offline 00:25:27.531 0 00:25:27.790 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:27.790 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:27.790 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:28.050 /dev/nbd0 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:28.050 1+0 records in 00:25:28.050 1+0 records out 00:25:28.050 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000247881 s, 16.5 MB/s 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:28.050 16:01:48 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:28.050 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:28.309 /dev/nbd1 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:28.309 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:28.309 1+0 records in 00:25:28.309 1+0 records out 00:25:28.310 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000296969 s, 13.8 MB/s 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.310 16:01:48 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:28.310 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:28.569 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:28.570 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:28.570 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:28.570 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:28.570 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:28.570 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:28.570 16:01:48 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:28.570 16:01:48 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:28.830 /dev/nbd1 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:28.830 1+0 records in 00:25:28.830 1+0 records out 00:25:28.830 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264279 s, 15.5 MB/s 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:28.830 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename 
/dev/nbd1 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:29.090 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2648996 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 2648996 ']' 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 2648996 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2648996 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2648996' 00:25:29.350 killing process with pid 2648996 00:25:29.350 16:01:49 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 2648996 00:25:29.350 Received shutdown signal, test time was about 11.752449 seconds 00:25:29.350 00:25:29.350 Latency(us) 00:25:29.350 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:29.350 =================================================================================================================== 00:25:29.350 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:29.350 [2024-07-12 16:01:49.639503] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 2648996 00:25:29.350 [2024-07-12 16:01:49.662310] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:29.350 00:25:29.350 real 0m16.363s 00:25:29.350 user 0m25.773s 00:25:29.350 sys 0m2.166s 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:29.350 16:01:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:29.350 ************************************ 00:25:29.350 END TEST raid_rebuild_test_io 00:25:29.350 ************************************ 00:25:29.611 16:01:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:29.611 16:01:49 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:25:29.611 16:01:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:29.611 16:01:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:29.611 16:01:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:29.611 ************************************ 00:25:29.611 START TEST raid_rebuild_test_sb_io 00:25:29.611 ************************************ 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:25:29.611 16:01:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2652002 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2652002 /var/tmp/spdk-raid.sock 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 2652002 ']' 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:29.611 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:29.611 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:29.611 [2024-07-12 16:01:49.934681] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:25:29.611 [2024-07-12 16:01:49.934747] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2652002 ] 00:25:29.611 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:25:29.611 Zero copy mechanism will not be used. 00:25:29.611 [2024-07-12 16:01:50.025603] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:29.872 [2024-07-12 16:01:50.105523] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:29.872 [2024-07-12 16:01:50.156303] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:29.872 [2024-07-12 16:01:50.156329] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:30.441 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:30.441 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:25:30.441 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:30.441 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:30.702 BaseBdev1_malloc 00:25:30.702 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:30.702 [2024-07-12 16:01:51.090996] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:30.702 [2024-07-12 16:01:51.091029] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:30.702 [2024-07-12 16:01:51.091043] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a2c010 00:25:30.702 [2024-07-12 16:01:51.091050] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:30.702 [2024-07-12 16:01:51.092348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:30.702 [2024-07-12 16:01:51.092367] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:30.702 BaseBdev1 00:25:30.702 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:30.702 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:30.961 BaseBdev2_malloc 00:25:30.962 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:31.222 [2024-07-12 16:01:51.461766] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:31.222 [2024-07-12 16:01:51.461792] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.222 [2024-07-12 16:01:51.461810] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a2cc30 00:25:31.222 [2024-07-12 16:01:51.461816] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.222 [2024-07-12 16:01:51.462979] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.222 [2024-07-12 16:01:51.462996] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:31.222 BaseBdev2 00:25:31.222 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:31.222 16:01:51 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:31.481 BaseBdev3_malloc 00:25:31.481 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:31.481 [2024-07-12 16:01:51.864591] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:31.481 [2024-07-12 16:01:51.864617] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.481 [2024-07-12 16:01:51.864628] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bd42c0 00:25:31.481 [2024-07-12 16:01:51.864634] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.481 [2024-07-12 16:01:51.865806] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.481 [2024-07-12 16:01:51.865824] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:31.481 BaseBdev3 00:25:31.481 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:31.481 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:31.741 BaseBdev4_malloc 00:25:31.741 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:32.000 [2024-07-12 16:01:52.267459] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:32.000 [2024-07-12 16:01:52.267485] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:32.000 [2024-07-12 16:01:52.267498] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a244e0 00:25:32.001 [2024-07-12 16:01:52.267505] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:32.001 [2024-07-12 16:01:52.268675] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:32.001 [2024-07-12 16:01:52.268694] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:32.001 BaseBdev4 00:25:32.001 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:32.260 spare_malloc 00:25:32.260 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:32.260 spare_delay 00:25:32.260 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:32.521 [2024-07-12 16:01:52.838876] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:32.521 [2024-07-12 16:01:52.838904] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
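The bdev_malloc_create / bdev_passthru_create / bdev_delay_create calls above follow a fixed pattern for building the test's member devices: four 32 MB, 512-byte-block malloc bdevs wrapped in passthru bdevs named BaseBdev1..4, plus a spare stacked on a delay bdev so the rebuild is slow enough to observe. A condensed sketch of that setup, with sizes, names and arguments copied from the log (the loop itself is illustrative, not the verbatim script):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    for i in 1 2 3 4; do
        "$rpc" -s "$sock" bdev_malloc_create 32 512 -b "BaseBdev${i}_malloc"
        "$rpc" -s "$sock" bdev_passthru_create -b "BaseBdev${i}_malloc" -p "BaseBdev${i}"
    done
    # the spare sits on a delay bdev so writes to it are slowed and rebuild progress can be sampled
    "$rpc" -s "$sock" bdev_malloc_create 32 512 -b spare_malloc
    "$rpc" -s "$sock" bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
    "$rpc" -s "$sock" bdev_passthru_create -b spare_delay -p spare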
00:25:32.521 [2024-07-12 16:01:52.838917] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a22c80 00:25:32.521 [2024-07-12 16:01:52.838923] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:32.521 [2024-07-12 16:01:52.840122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:32.521 [2024-07-12 16:01:52.840145] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:32.521 spare 00:25:32.521 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:32.781 [2024-07-12 16:01:53.027378] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:32.781 [2024-07-12 16:01:53.028448] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:32.781 [2024-07-12 16:01:53.028489] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:32.781 [2024-07-12 16:01:53.028523] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:32.781 [2024-07-12 16:01:53.028666] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a255b0 00:25:32.781 [2024-07-12 16:01:53.028673] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:32.781 [2024-07-12 16:01:53.028830] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bd12a0 00:25:32.781 [2024-07-12 16:01:53.028945] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a255b0 00:25:32.781 [2024-07-12 16:01:53.028951] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a255b0 00:25:32.781 [2024-07-12 16:01:53.029019] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.781 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.040 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.040 "name": "raid_bdev1", 00:25:33.040 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:33.040 "strip_size_kb": 0, 00:25:33.040 "state": "online", 00:25:33.040 "raid_level": "raid1", 00:25:33.040 "superblock": true, 00:25:33.040 "num_base_bdevs": 4, 00:25:33.040 "num_base_bdevs_discovered": 4, 00:25:33.040 "num_base_bdevs_operational": 4, 00:25:33.040 "base_bdevs_list": [ 00:25:33.040 { 00:25:33.040 "name": "BaseBdev1", 00:25:33.040 "uuid": "6d6290b3-94b7-5564-b115-21b715d4b0ef", 00:25:33.040 "is_configured": true, 00:25:33.040 "data_offset": 2048, 00:25:33.040 "data_size": 63488 00:25:33.040 }, 00:25:33.040 { 00:25:33.040 "name": "BaseBdev2", 00:25:33.040 "uuid": "e281fc3c-ef17-59ef-8c44-5ae2b53aed55", 00:25:33.040 "is_configured": true, 00:25:33.040 "data_offset": 2048, 00:25:33.040 "data_size": 63488 00:25:33.040 }, 00:25:33.040 { 00:25:33.040 "name": "BaseBdev3", 00:25:33.040 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:33.040 "is_configured": true, 00:25:33.040 "data_offset": 2048, 00:25:33.040 "data_size": 63488 00:25:33.040 }, 00:25:33.040 { 00:25:33.041 "name": "BaseBdev4", 00:25:33.041 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:33.041 "is_configured": true, 00:25:33.041 "data_offset": 2048, 00:25:33.041 "data_size": 63488 00:25:33.041 } 00:25:33.041 ] 00:25:33.041 }' 00:25:33.041 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.041 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:33.608 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:33.608 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:33.608 [2024-07-12 16:01:53.994031] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:33.608 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:33.608 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.608 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:33.878 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:33.878 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:33.878 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:33.878 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:33.878 [2024-07-12 16:01:54.295945] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a2b120 00:25:33.878 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:33.878 Zero copy mechanism will not be used. 00:25:33.878 Running I/O for 60 seconds... 
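With the members in place, the trace above creates the array with a superblock (-s), reads back its usable size and data offset, then starts background bdevperf traffic before pulling BaseBdev1 out. The two jq reads can be reproduced as below; the variable names mirror the script's, and the 63488/2048 values come straight from the log (2048 blocks reserved for the superblock out of each 65536-block member).

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    raid_bdev_size=$("$rpc" -s "$sock" bdev_get_bdevs -b raid_bdev1 | jq -r '.[].num_blocks')
    data_offset=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[].base_bdevs_list[0].data_offset')
    echo "raid_bdev1: ${raid_bdev_size} usable blocks, base-bdev data offset ${data_offset}"
    # expected here: 63488 and 2048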
00:25:34.190 [2024-07-12 16:01:54.391099] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:34.190 [2024-07-12 16:01:54.397718] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1a2b120 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.190 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.462 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:34.462 "name": "raid_bdev1", 00:25:34.462 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:34.462 "strip_size_kb": 0, 00:25:34.462 "state": "online", 00:25:34.462 "raid_level": "raid1", 00:25:34.462 "superblock": true, 00:25:34.462 "num_base_bdevs": 4, 00:25:34.462 "num_base_bdevs_discovered": 3, 00:25:34.462 "num_base_bdevs_operational": 3, 00:25:34.462 "base_bdevs_list": [ 00:25:34.462 { 00:25:34.462 "name": null, 00:25:34.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.462 "is_configured": false, 00:25:34.462 "data_offset": 2048, 00:25:34.462 "data_size": 63488 00:25:34.462 }, 00:25:34.462 { 00:25:34.462 "name": "BaseBdev2", 00:25:34.462 "uuid": "e281fc3c-ef17-59ef-8c44-5ae2b53aed55", 00:25:34.462 "is_configured": true, 00:25:34.462 "data_offset": 2048, 00:25:34.462 "data_size": 63488 00:25:34.462 }, 00:25:34.462 { 00:25:34.462 "name": "BaseBdev3", 00:25:34.462 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:34.462 "is_configured": true, 00:25:34.462 "data_offset": 2048, 00:25:34.462 "data_size": 63488 00:25:34.462 }, 00:25:34.462 { 00:25:34.462 "name": "BaseBdev4", 00:25:34.462 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:34.462 "is_configured": true, 00:25:34.462 "data_offset": 2048, 00:25:34.462 "data_size": 63488 00:25:34.462 } 00:25:34.462 ] 00:25:34.462 }' 00:25:34.462 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:34.462 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:35.031 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 
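The @639/@642 steps above remove BaseBdev1 while bdevperf keeps random read/write traffic running, verify that the array stays online in degraded form (a null slot, three discovered members), and then hot-add the spare, which triggers the rebuild whose start is logged just below. Condensed as a sketch using the same RPC calls that appear in the trace:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    "$rpc" -s "$sock" bdev_raid_remove_base_bdev BaseBdev1      # degrade under live I/O
    "$rpc" -s "$sock" bdev_raid_get_bdevs all |
        jq -r '.[] | select(.name == "raid_bdev1") | .num_base_bdevs_discovered'   # expect 3
    "$rpc" -s "$sock" bdev_raid_add_base_bdev raid_bdev1 spare  # spare triggers the rebuild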
00:25:35.031 [2024-07-12 16:01:55.428596] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:35.031 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:35.031 [2024-07-12 16:01:55.471776] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a28230 00:25:35.031 [2024-07-12 16:01:55.473408] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:35.290 [2024-07-12 16:01:55.589601] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:35.290 [2024-07-12 16:01:55.589848] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:35.549 [2024-07-12 16:01:55.807839] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:35.549 [2024-07-12 16:01:55.807948] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:36.117 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:36.118 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:36.118 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:36.118 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:36.118 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:36.118 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.118 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.118 [2024-07-12 16:01:56.499593] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:36.377 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:36.377 "name": "raid_bdev1", 00:25:36.377 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:36.377 "strip_size_kb": 0, 00:25:36.377 "state": "online", 00:25:36.377 "raid_level": "raid1", 00:25:36.377 "superblock": true, 00:25:36.377 "num_base_bdevs": 4, 00:25:36.377 "num_base_bdevs_discovered": 4, 00:25:36.377 "num_base_bdevs_operational": 4, 00:25:36.377 "process": { 00:25:36.377 "type": "rebuild", 00:25:36.377 "target": "spare", 00:25:36.377 "progress": { 00:25:36.377 "blocks": 14336, 00:25:36.377 "percent": 22 00:25:36.377 } 00:25:36.377 }, 00:25:36.377 "base_bdevs_list": [ 00:25:36.377 { 00:25:36.377 "name": "spare", 00:25:36.377 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:36.377 "is_configured": true, 00:25:36.377 "data_offset": 2048, 00:25:36.377 "data_size": 63488 00:25:36.377 }, 00:25:36.377 { 00:25:36.377 "name": "BaseBdev2", 00:25:36.377 "uuid": "e281fc3c-ef17-59ef-8c44-5ae2b53aed55", 00:25:36.377 "is_configured": true, 00:25:36.377 "data_offset": 2048, 00:25:36.377 "data_size": 63488 00:25:36.377 }, 00:25:36.377 { 00:25:36.377 "name": "BaseBdev3", 00:25:36.377 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:36.377 "is_configured": true, 00:25:36.377 "data_offset": 2048, 00:25:36.377 "data_size": 63488 00:25:36.377 
}, 00:25:36.378 { 00:25:36.378 "name": "BaseBdev4", 00:25:36.378 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:36.378 "is_configured": true, 00:25:36.378 "data_offset": 2048, 00:25:36.378 "data_size": 63488 00:25:36.378 } 00:25:36.378 ] 00:25:36.378 }' 00:25:36.378 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:36.378 [2024-07-12 16:01:56.716846] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:36.378 [2024-07-12 16:01:56.717002] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:36.378 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:36.378 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:36.378 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:36.378 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:36.636 [2024-07-12 16:01:56.928859] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.636 [2024-07-12 16:01:57.052507] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:36.636 [2024-07-12 16:01:57.068547] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:36.636 [2024-07-12 16:01:57.068568] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:36.636 [2024-07-12 16:01:57.068573] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:36.636 [2024-07-12 16:01:57.072486] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1a2b120 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.896 "name": "raid_bdev1", 00:25:36.896 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:36.896 "strip_size_kb": 0, 00:25:36.896 "state": "online", 00:25:36.896 "raid_level": "raid1", 00:25:36.896 "superblock": true, 00:25:36.896 "num_base_bdevs": 4, 00:25:36.896 "num_base_bdevs_discovered": 3, 00:25:36.896 "num_base_bdevs_operational": 3, 00:25:36.896 "base_bdevs_list": [ 00:25:36.896 { 00:25:36.896 "name": null, 00:25:36.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.896 "is_configured": false, 00:25:36.896 "data_offset": 2048, 00:25:36.896 "data_size": 63488 00:25:36.896 }, 00:25:36.896 { 00:25:36.896 "name": "BaseBdev2", 00:25:36.896 "uuid": "e281fc3c-ef17-59ef-8c44-5ae2b53aed55", 00:25:36.896 "is_configured": true, 00:25:36.896 "data_offset": 2048, 00:25:36.896 "data_size": 63488 00:25:36.896 }, 00:25:36.896 { 00:25:36.896 "name": "BaseBdev3", 00:25:36.896 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:36.896 "is_configured": true, 00:25:36.896 "data_offset": 2048, 00:25:36.896 "data_size": 63488 00:25:36.896 }, 00:25:36.896 { 00:25:36.896 "name": "BaseBdev4", 00:25:36.896 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:36.896 "is_configured": true, 00:25:36.896 "data_offset": 2048, 00:25:36.896 "data_size": 63488 00:25:36.896 } 00:25:36.896 ] 00:25:36.896 }' 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.896 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:37.830 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:37.830 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:37.830 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:37.830 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:37.830 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:37.830 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.830 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.089 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:38.089 "name": "raid_bdev1", 00:25:38.089 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:38.089 "strip_size_kb": 0, 00:25:38.089 "state": "online", 00:25:38.089 "raid_level": "raid1", 00:25:38.089 "superblock": true, 00:25:38.089 "num_base_bdevs": 4, 00:25:38.089 "num_base_bdevs_discovered": 3, 00:25:38.089 "num_base_bdevs_operational": 3, 00:25:38.089 "base_bdevs_list": [ 00:25:38.089 { 00:25:38.089 "name": null, 00:25:38.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.089 "is_configured": false, 00:25:38.089 "data_offset": 2048, 00:25:38.089 "data_size": 63488 00:25:38.089 }, 00:25:38.089 { 00:25:38.089 "name": "BaseBdev2", 00:25:38.089 "uuid": "e281fc3c-ef17-59ef-8c44-5ae2b53aed55", 00:25:38.089 "is_configured": true, 00:25:38.089 "data_offset": 2048, 00:25:38.089 "data_size": 63488 00:25:38.089 }, 00:25:38.089 { 00:25:38.089 "name": "BaseBdev3", 00:25:38.089 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:38.089 "is_configured": 
true, 00:25:38.089 "data_offset": 2048, 00:25:38.089 "data_size": 63488 00:25:38.089 }, 00:25:38.089 { 00:25:38.089 "name": "BaseBdev4", 00:25:38.089 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:38.089 "is_configured": true, 00:25:38.089 "data_offset": 2048, 00:25:38.089 "data_size": 63488 00:25:38.089 } 00:25:38.089 ] 00:25:38.089 }' 00:25:38.089 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:38.349 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:38.349 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:38.349 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:38.349 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:38.915 [2024-07-12 16:01:59.118705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:38.915 [2024-07-12 16:01:59.168870] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a27b20 00:25:38.915 [2024-07-12 16:01:59.170066] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:38.915 16:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:38.915 [2024-07-12 16:01:59.286621] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:38.915 [2024-07-12 16:01:59.287400] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:39.173 [2024-07-12 16:01:59.511994] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:39.173 [2024-07-12 16:01:59.512117] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:39.431 [2024-07-12 16:01:59.841501] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:40.000 [2024-07-12 16:02:00.175784] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:40.000 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:40.000 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:40.000 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:40.000 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:40.000 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:40.000 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.000 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.000 [2024-07-12 16:02:00.299403] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:40.446 [2024-07-12 
16:02:00.545608] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:40.446 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:40.446 "name": "raid_bdev1", 00:25:40.446 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:40.446 "strip_size_kb": 0, 00:25:40.446 "state": "online", 00:25:40.446 "raid_level": "raid1", 00:25:40.446 "superblock": true, 00:25:40.446 "num_base_bdevs": 4, 00:25:40.446 "num_base_bdevs_discovered": 4, 00:25:40.446 "num_base_bdevs_operational": 4, 00:25:40.446 "process": { 00:25:40.446 "type": "rebuild", 00:25:40.446 "target": "spare", 00:25:40.446 "progress": { 00:25:40.446 "blocks": 20480, 00:25:40.446 "percent": 32 00:25:40.446 } 00:25:40.446 }, 00:25:40.446 "base_bdevs_list": [ 00:25:40.446 { 00:25:40.446 "name": "spare", 00:25:40.446 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:40.446 "is_configured": true, 00:25:40.446 "data_offset": 2048, 00:25:40.446 "data_size": 63488 00:25:40.446 }, 00:25:40.446 { 00:25:40.446 "name": "BaseBdev2", 00:25:40.446 "uuid": "e281fc3c-ef17-59ef-8c44-5ae2b53aed55", 00:25:40.446 "is_configured": true, 00:25:40.446 "data_offset": 2048, 00:25:40.446 "data_size": 63488 00:25:40.446 }, 00:25:40.446 { 00:25:40.446 "name": "BaseBdev3", 00:25:40.446 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:40.446 "is_configured": true, 00:25:40.446 "data_offset": 2048, 00:25:40.446 "data_size": 63488 00:25:40.446 }, 00:25:40.446 { 00:25:40.446 "name": "BaseBdev4", 00:25:40.446 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:40.446 "is_configured": true, 00:25:40.446 "data_offset": 2048, 00:25:40.446 "data_size": 63488 00:25:40.446 } 00:25:40.446 ] 00:25:40.446 }' 00:25:40.446 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:40.446 [2024-07-12 16:02:00.756586] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:40.446 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:40.446 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:40.446 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:40.446 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:40.446 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:40.447 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:40.447 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:25:40.447 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:40.447 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:25:40.447 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:40.706 [2024-07-12 16:02:00.991451] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:40.706 [2024-07-12 16:02:01.108235] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:40.706 [2024-07-12 16:02:01.108635] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:40.966 [2024-07-12 16:02:01.327323] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:41.225 [2024-07-12 16:02:01.579962] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1a2b120 00:25:41.225 [2024-07-12 16:02:01.579981] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1a27b20 00:25:41.225 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:25:41.225 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:25:41.225 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:41.225 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:41.225 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:41.225 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:41.225 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:41.225 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.225 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:41.793 [2024-07-12 16:02:01.935015] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:41.793 [2024-07-12 16:02:02.042825] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:25:41.793 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:41.793 "name": "raid_bdev1", 00:25:41.793 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:41.793 "strip_size_kb": 0, 00:25:41.793 "state": "online", 00:25:41.793 "raid_level": "raid1", 00:25:41.793 "superblock": true, 00:25:41.793 "num_base_bdevs": 4, 00:25:41.793 "num_base_bdevs_discovered": 3, 00:25:41.793 "num_base_bdevs_operational": 3, 00:25:41.793 "process": { 00:25:41.793 "type": "rebuild", 00:25:41.793 "target": "spare", 00:25:41.793 "progress": { 00:25:41.793 "blocks": 40960, 00:25:41.793 "percent": 64 00:25:41.793 } 00:25:41.793 }, 00:25:41.793 "base_bdevs_list": [ 00:25:41.793 { 00:25:41.793 "name": "spare", 00:25:41.793 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:41.793 "is_configured": true, 00:25:41.793 "data_offset": 2048, 00:25:41.793 "data_size": 63488 00:25:41.793 }, 00:25:41.793 { 00:25:41.793 "name": null, 00:25:41.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:41.793 "is_configured": false, 00:25:41.793 "data_offset": 2048, 00:25:41.793 "data_size": 63488 00:25:41.793 }, 00:25:41.793 { 00:25:41.793 "name": "BaseBdev3", 00:25:41.793 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:41.793 "is_configured": true, 00:25:41.793 "data_offset": 2048, 00:25:41.793 "data_size": 63488 00:25:41.793 }, 00:25:41.793 { 00:25:41.793 "name": "BaseBdev4", 00:25:41.793 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:41.793 
"is_configured": true, 00:25:41.793 "data_offset": 2048, 00:25:41.793 "data_size": 63488 00:25:41.793 } 00:25:41.793 ] 00:25:41.793 }' 00:25:41.793 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:41.793 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=868 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.053 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.620 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:42.620 "name": "raid_bdev1", 00:25:42.620 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:42.620 "strip_size_kb": 0, 00:25:42.620 "state": "online", 00:25:42.620 "raid_level": "raid1", 00:25:42.620 "superblock": true, 00:25:42.620 "num_base_bdevs": 4, 00:25:42.620 "num_base_bdevs_discovered": 3, 00:25:42.620 "num_base_bdevs_operational": 3, 00:25:42.620 "process": { 00:25:42.620 "type": "rebuild", 00:25:42.620 "target": "spare", 00:25:42.620 "progress": { 00:25:42.620 "blocks": 53248, 00:25:42.620 "percent": 83 00:25:42.620 } 00:25:42.620 }, 00:25:42.620 "base_bdevs_list": [ 00:25:42.620 { 00:25:42.620 "name": "spare", 00:25:42.620 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:42.620 "is_configured": true, 00:25:42.620 "data_offset": 2048, 00:25:42.620 "data_size": 63488 00:25:42.620 }, 00:25:42.620 { 00:25:42.620 "name": null, 00:25:42.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.620 "is_configured": false, 00:25:42.620 "data_offset": 2048, 00:25:42.620 "data_size": 63488 00:25:42.620 }, 00:25:42.620 { 00:25:42.620 "name": "BaseBdev3", 00:25:42.620 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:42.620 "is_configured": true, 00:25:42.620 "data_offset": 2048, 00:25:42.620 "data_size": 63488 00:25:42.620 }, 00:25:42.620 { 00:25:42.620 "name": "BaseBdev4", 00:25:42.620 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:42.620 "is_configured": true, 00:25:42.620 "data_offset": 2048, 00:25:42.620 "data_size": 63488 00:25:42.620 } 00:25:42.620 ] 00:25:42.620 }' 00:25:42.621 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:42.621 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 
-- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:42.621 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:42.621 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:42.621 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:42.879 [2024-07-12 16:02:03.277683] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:43.138 [2024-07-12 16:02:03.377988] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:43.138 [2024-07-12 16:02:03.379447] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:43.707 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:43.707 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:43.707 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:43.707 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:43.707 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:43.707 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.707 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.707 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.707 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:43.707 "name": "raid_bdev1", 00:25:43.707 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:43.707 "strip_size_kb": 0, 00:25:43.707 "state": "online", 00:25:43.707 "raid_level": "raid1", 00:25:43.707 "superblock": true, 00:25:43.707 "num_base_bdevs": 4, 00:25:43.707 "num_base_bdevs_discovered": 3, 00:25:43.707 "num_base_bdevs_operational": 3, 00:25:43.707 "base_bdevs_list": [ 00:25:43.707 { 00:25:43.707 "name": "spare", 00:25:43.707 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:43.707 "is_configured": true, 00:25:43.707 "data_offset": 2048, 00:25:43.707 "data_size": 63488 00:25:43.707 }, 00:25:43.707 { 00:25:43.707 "name": null, 00:25:43.707 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.707 "is_configured": false, 00:25:43.707 "data_offset": 2048, 00:25:43.707 "data_size": 63488 00:25:43.707 }, 00:25:43.707 { 00:25:43.707 "name": "BaseBdev3", 00:25:43.707 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:43.707 "is_configured": true, 00:25:43.707 "data_offset": 2048, 00:25:43.707 "data_size": 63488 00:25:43.707 }, 00:25:43.707 { 00:25:43.707 "name": "BaseBdev4", 00:25:43.707 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:43.707 "is_configured": true, 00:25:43.707 "data_offset": 2048, 00:25:43.707 "data_size": 63488 00:25:43.707 } 00:25:43.707 ] 00:25:43.707 }' 00:25:43.707 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:43.967 
16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.967 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:44.228 "name": "raid_bdev1", 00:25:44.228 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:44.228 "strip_size_kb": 0, 00:25:44.228 "state": "online", 00:25:44.228 "raid_level": "raid1", 00:25:44.228 "superblock": true, 00:25:44.228 "num_base_bdevs": 4, 00:25:44.228 "num_base_bdevs_discovered": 3, 00:25:44.228 "num_base_bdevs_operational": 3, 00:25:44.228 "base_bdevs_list": [ 00:25:44.228 { 00:25:44.228 "name": "spare", 00:25:44.228 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:44.228 "is_configured": true, 00:25:44.228 "data_offset": 2048, 00:25:44.228 "data_size": 63488 00:25:44.228 }, 00:25:44.228 { 00:25:44.228 "name": null, 00:25:44.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.228 "is_configured": false, 00:25:44.228 "data_offset": 2048, 00:25:44.228 "data_size": 63488 00:25:44.228 }, 00:25:44.228 { 00:25:44.228 "name": "BaseBdev3", 00:25:44.228 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:44.228 "is_configured": true, 00:25:44.228 "data_offset": 2048, 00:25:44.228 "data_size": 63488 00:25:44.228 }, 00:25:44.228 { 00:25:44.228 "name": "BaseBdev4", 00:25:44.228 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:44.228 "is_configured": true, 00:25:44.228 "data_offset": 2048, 00:25:44.228 "data_size": 63488 00:25:44.228 } 00:25:44.228 ] 00:25:44.228 }' 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
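(Aside for readers following the xtrace output: the verify_raid_bdev_state / verify_raid_bdev_process helpers being stepped through above reduce to the commands already visible in the trace. The sketch below is a reconstruction from those traced commands, not the literal bdev_raid.sh source; only rpc.py, its bdev_raid_get_bdevs subcommand and the jq filters are taken verbatim from the log, the surrounding control flow is an approximation.)

    rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    rpc_sock=/var/tmp/spdk-raid.sock

    # Approximate shape of the traced helper: fetch the raid bdev's JSON once,
    # then assert on the background-process fields ("none" when the array is idle).
    verify_raid_bdev_process() {
        local raid_bdev_name=$1 process_type=$2 target=$3 raid_bdev_info
        raid_bdev_info=$("$rpc_py" -s "$rpc_sock" bdev_raid_get_bdevs all |
            jq -r ".[] | select(.name == \"$raid_bdev_name\")")
        [[ $(jq -r '.process.type // "none"' <<< "$raid_bdev_info") == "$process_type" ]]
        [[ $(jq -r '.process.target // "none"' <<< "$raid_bdev_info") == "$target" ]]
    }

    # Usage mirroring the trace: while a rebuild is running the checks expect
    # rebuild/spare, after completion they expect none/none.
    verify_raid_bdev_process raid_bdev1 rebuild spare
    verify_raid_bdev_process raid_bdev1 none none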
00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.228 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.487 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:44.487 "name": "raid_bdev1", 00:25:44.487 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:44.487 "strip_size_kb": 0, 00:25:44.487 "state": "online", 00:25:44.487 "raid_level": "raid1", 00:25:44.487 "superblock": true, 00:25:44.487 "num_base_bdevs": 4, 00:25:44.487 "num_base_bdevs_discovered": 3, 00:25:44.487 "num_base_bdevs_operational": 3, 00:25:44.487 "base_bdevs_list": [ 00:25:44.487 { 00:25:44.487 "name": "spare", 00:25:44.487 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:44.487 "is_configured": true, 00:25:44.487 "data_offset": 2048, 00:25:44.487 "data_size": 63488 00:25:44.487 }, 00:25:44.487 { 00:25:44.487 "name": null, 00:25:44.487 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.487 "is_configured": false, 00:25:44.487 "data_offset": 2048, 00:25:44.488 "data_size": 63488 00:25:44.488 }, 00:25:44.488 { 00:25:44.488 "name": "BaseBdev3", 00:25:44.488 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:44.488 "is_configured": true, 00:25:44.488 "data_offset": 2048, 00:25:44.488 "data_size": 63488 00:25:44.488 }, 00:25:44.488 { 00:25:44.488 "name": "BaseBdev4", 00:25:44.488 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:44.488 "is_configured": true, 00:25:44.488 "data_offset": 2048, 00:25:44.488 "data_size": 63488 00:25:44.488 } 00:25:44.488 ] 00:25:44.488 }' 00:25:44.488 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:44.488 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:45.439 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:45.439 [2024-07-12 16:02:05.783178] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:45.439 [2024-07-12 16:02:05.783200] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:45.439 00:25:45.439 Latency(us) 00:25:45.439 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:45.439 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:45.439 raid_bdev1 : 11.54 117.82 353.47 0.00 0.00 11915.63 248.91 116956.55 00:25:45.439 =================================================================================================================== 00:25:45.440 Total : 117.82 353.47 0.00 0.00 11915.63 248.91 116956.55 00:25:45.440 [2024-07-12 16:02:05.870598] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:25:45.440 [2024-07-12 16:02:05.870621] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:45.440 [2024-07-12 16:02:05.870695] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:45.440 [2024-07-12 16:02:05.870702] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a255b0 name raid_bdev1, state offline 00:25:45.440 0 00:25:45.708 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.708 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:45.708 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:45.968 /dev/nbd0 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:25:45.968 1+0 records in 00:25:45.968 1+0 records out 00:25:45.968 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240609 s, 17.0 MB/s 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:45.968 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:46.228 /dev/nbd1 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:46.229 16:02:06 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:46.229 1+0 records in 00:25:46.229 1+0 records out 00:25:46.229 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000198803 s, 20.6 MB/s 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:46.229 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:25:46.489 16:02:06 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.489 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:46.749 /dev/nbd1 00:25:46.749 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:46.749 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:46.749 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:46.749 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:25:46.749 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:46.749 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:46.749 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:46.749 1+0 records in 00:25:46.749 1+0 records out 00:25:46.749 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313382 s, 13.1 MB/s 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # 
cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:46.749 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:47.009 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:47.269 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:47.269 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:47.269 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:47.269 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:47.269 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:47.269 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:47.269 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:47.269 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:47.269 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:47.270 
16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:47.270 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:47.530 [2024-07-12 16:02:07.837661] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:47.530 [2024-07-12 16:02:07.837692] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:47.530 [2024-07-12 16:02:07.837705] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a23ce0 00:25:47.530 [2024-07-12 16:02:07.837716] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:47.530 [2024-07-12 16:02:07.839080] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:47.530 [2024-07-12 16:02:07.839100] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:47.530 [2024-07-12 16:02:07.839159] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:47.530 [2024-07-12 16:02:07.839180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:47.530 [2024-07-12 16:02:07.839262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:47.530 [2024-07-12 16:02:07.839318] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:47.530 spare 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.530 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:47.530 [2024-07-12 16:02:07.939609] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a77720 00:25:47.530 [2024-07-12 16:02:07.939617] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:47.530 [2024-07-12 16:02:07.939979] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a23540 00:25:47.530 [2024-07-12 16:02:07.940100] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a77720 00:25:47.530 [2024-07-12 16:02:07.940106] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a77720 00:25:47.530 [2024-07-12 16:02:07.940188] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:47.790 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:47.790 "name": "raid_bdev1", 00:25:47.790 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:47.790 "strip_size_kb": 0, 00:25:47.790 "state": "online", 00:25:47.790 "raid_level": "raid1", 00:25:47.790 "superblock": true, 00:25:47.790 "num_base_bdevs": 4, 00:25:47.790 "num_base_bdevs_discovered": 3, 00:25:47.790 "num_base_bdevs_operational": 3, 00:25:47.790 "base_bdevs_list": [ 00:25:47.790 { 00:25:47.790 "name": "spare", 00:25:47.790 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:47.790 "is_configured": true, 00:25:47.790 "data_offset": 2048, 00:25:47.790 "data_size": 63488 00:25:47.790 }, 00:25:47.790 { 00:25:47.790 "name": null, 00:25:47.790 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:47.790 "is_configured": false, 00:25:47.790 "data_offset": 2048, 00:25:47.790 "data_size": 63488 00:25:47.790 }, 00:25:47.790 { 00:25:47.790 "name": "BaseBdev3", 00:25:47.790 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:47.790 "is_configured": true, 00:25:47.790 "data_offset": 2048, 00:25:47.790 "data_size": 63488 00:25:47.790 }, 00:25:47.790 { 00:25:47.790 "name": "BaseBdev4", 00:25:47.790 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:47.790 "is_configured": true, 00:25:47.790 "data_offset": 2048, 00:25:47.790 "data_size": 63488 00:25:47.790 } 00:25:47.790 ] 00:25:47.790 }' 00:25:47.790 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:47.790 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:48.360 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:48.360 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:48.360 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:48.360 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:48.360 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:48.361 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.361 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:48.620 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:48.620 "name": "raid_bdev1", 00:25:48.620 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:48.620 "strip_size_kb": 0, 00:25:48.620 "state": "online", 00:25:48.620 "raid_level": "raid1", 00:25:48.620 "superblock": true, 00:25:48.620 "num_base_bdevs": 4, 00:25:48.620 "num_base_bdevs_discovered": 3, 00:25:48.620 "num_base_bdevs_operational": 3, 00:25:48.620 "base_bdevs_list": [ 00:25:48.620 { 00:25:48.620 "name": "spare", 00:25:48.620 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:48.620 "is_configured": true, 00:25:48.620 "data_offset": 2048, 
00:25:48.620 "data_size": 63488 00:25:48.620 }, 00:25:48.620 { 00:25:48.620 "name": null, 00:25:48.620 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:48.620 "is_configured": false, 00:25:48.620 "data_offset": 2048, 00:25:48.620 "data_size": 63488 00:25:48.620 }, 00:25:48.620 { 00:25:48.620 "name": "BaseBdev3", 00:25:48.620 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:48.620 "is_configured": true, 00:25:48.620 "data_offset": 2048, 00:25:48.620 "data_size": 63488 00:25:48.620 }, 00:25:48.620 { 00:25:48.620 "name": "BaseBdev4", 00:25:48.620 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:48.620 "is_configured": true, 00:25:48.620 "data_offset": 2048, 00:25:48.620 "data_size": 63488 00:25:48.620 } 00:25:48.620 ] 00:25:48.620 }' 00:25:48.620 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:48.620 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:48.620 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:48.620 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:48.620 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.620 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:48.880 [2024-07-12 16:02:09.309776] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:48.880 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:49.140 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:49.140 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.140 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:49.140 "name": 
"raid_bdev1", 00:25:49.140 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:49.140 "strip_size_kb": 0, 00:25:49.140 "state": "online", 00:25:49.140 "raid_level": "raid1", 00:25:49.140 "superblock": true, 00:25:49.140 "num_base_bdevs": 4, 00:25:49.140 "num_base_bdevs_discovered": 2, 00:25:49.140 "num_base_bdevs_operational": 2, 00:25:49.140 "base_bdevs_list": [ 00:25:49.140 { 00:25:49.140 "name": null, 00:25:49.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.140 "is_configured": false, 00:25:49.140 "data_offset": 2048, 00:25:49.140 "data_size": 63488 00:25:49.140 }, 00:25:49.140 { 00:25:49.140 "name": null, 00:25:49.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:49.140 "is_configured": false, 00:25:49.140 "data_offset": 2048, 00:25:49.140 "data_size": 63488 00:25:49.140 }, 00:25:49.140 { 00:25:49.140 "name": "BaseBdev3", 00:25:49.140 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:49.140 "is_configured": true, 00:25:49.140 "data_offset": 2048, 00:25:49.140 "data_size": 63488 00:25:49.140 }, 00:25:49.140 { 00:25:49.140 "name": "BaseBdev4", 00:25:49.140 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:49.140 "is_configured": true, 00:25:49.140 "data_offset": 2048, 00:25:49.140 "data_size": 63488 00:25:49.140 } 00:25:49.140 ] 00:25:49.140 }' 00:25:49.140 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:49.140 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:49.710 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:50.325 [2024-07-12 16:02:10.581135] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:50.325 [2024-07-12 16:02:10.581253] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:50.325 [2024-07-12 16:02:10.581264] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:50.325 [2024-07-12 16:02:10.581289] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:50.325 [2024-07-12 16:02:10.584202] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a2b720 00:25:50.325 [2024-07-12 16:02:10.585819] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:50.325 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:51.265 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:51.265 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:51.265 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:51.265 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:51.265 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:51.265 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:51.265 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.525 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:51.525 "name": "raid_bdev1", 00:25:51.525 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:51.525 "strip_size_kb": 0, 00:25:51.525 "state": "online", 00:25:51.525 "raid_level": "raid1", 00:25:51.525 "superblock": true, 00:25:51.525 "num_base_bdevs": 4, 00:25:51.525 "num_base_bdevs_discovered": 3, 00:25:51.525 "num_base_bdevs_operational": 3, 00:25:51.525 "process": { 00:25:51.525 "type": "rebuild", 00:25:51.525 "target": "spare", 00:25:51.525 "progress": { 00:25:51.525 "blocks": 24576, 00:25:51.525 "percent": 38 00:25:51.525 } 00:25:51.525 }, 00:25:51.525 "base_bdevs_list": [ 00:25:51.525 { 00:25:51.525 "name": "spare", 00:25:51.525 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:51.525 "is_configured": true, 00:25:51.525 "data_offset": 2048, 00:25:51.525 "data_size": 63488 00:25:51.525 }, 00:25:51.525 { 00:25:51.525 "name": null, 00:25:51.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:51.525 "is_configured": false, 00:25:51.525 "data_offset": 2048, 00:25:51.525 "data_size": 63488 00:25:51.525 }, 00:25:51.525 { 00:25:51.525 "name": "BaseBdev3", 00:25:51.525 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:51.525 "is_configured": true, 00:25:51.525 "data_offset": 2048, 00:25:51.525 "data_size": 63488 00:25:51.525 }, 00:25:51.525 { 00:25:51.525 "name": "BaseBdev4", 00:25:51.525 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:51.525 "is_configured": true, 00:25:51.525 "data_offset": 2048, 00:25:51.525 "data_size": 63488 00:25:51.525 } 00:25:51.525 ] 00:25:51.525 }' 00:25:51.525 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:51.525 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:51.525 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:51.525 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:51.525 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:51.785 [2024-07-12 16:02:12.090799] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:51.785 [2024-07-12 16:02:12.094765] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:51.785 [2024-07-12 16:02:12.094796] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:51.785 [2024-07-12 16:02:12.094806] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:51.785 [2024-07-12 16:02:12.094811] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.785 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.356 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:52.356 "name": "raid_bdev1", 00:25:52.356 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:52.356 "strip_size_kb": 0, 00:25:52.356 "state": "online", 00:25:52.356 "raid_level": "raid1", 00:25:52.356 "superblock": true, 00:25:52.356 "num_base_bdevs": 4, 00:25:52.356 "num_base_bdevs_discovered": 2, 00:25:52.356 "num_base_bdevs_operational": 2, 00:25:52.356 "base_bdevs_list": [ 00:25:52.356 { 00:25:52.356 "name": null, 00:25:52.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.356 "is_configured": false, 00:25:52.356 "data_offset": 2048, 00:25:52.356 "data_size": 63488 00:25:52.356 }, 00:25:52.356 { 00:25:52.356 "name": null, 00:25:52.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.356 "is_configured": false, 00:25:52.356 "data_offset": 2048, 00:25:52.356 "data_size": 63488 00:25:52.356 }, 00:25:52.356 { 00:25:52.356 "name": "BaseBdev3", 00:25:52.356 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:52.356 "is_configured": true, 00:25:52.356 "data_offset": 2048, 00:25:52.356 "data_size": 63488 00:25:52.356 }, 00:25:52.356 { 00:25:52.356 "name": "BaseBdev4", 00:25:52.356 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:52.356 "is_configured": true, 
00:25:52.356 "data_offset": 2048, 00:25:52.356 "data_size": 63488 00:25:52.356 } 00:25:52.356 ] 00:25:52.356 }' 00:25:52.356 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:52.356 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:53.303 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:53.303 [2024-07-12 16:02:13.550628] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:53.303 [2024-07-12 16:02:13.550661] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:53.303 [2024-07-12 16:02:13.550676] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a77b90 00:25:53.303 [2024-07-12 16:02:13.550682] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:53.303 [2024-07-12 16:02:13.550991] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:53.303 [2024-07-12 16:02:13.551002] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:53.303 [2024-07-12 16:02:13.551061] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:53.303 [2024-07-12 16:02:13.551068] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:53.303 [2024-07-12 16:02:13.551074] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:53.303 [2024-07-12 16:02:13.551086] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:53.303 [2024-07-12 16:02:13.554084] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a776d0 00:25:53.303 [2024-07-12 16:02:13.555237] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:53.303 spare 00:25:53.303 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:54.243 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:54.243 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:54.243 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:54.243 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:54.243 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:54.243 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.243 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.813 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:54.813 "name": "raid_bdev1", 00:25:54.813 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:54.813 "strip_size_kb": 0, 00:25:54.813 "state": "online", 00:25:54.813 "raid_level": "raid1", 00:25:54.813 "superblock": true, 00:25:54.813 "num_base_bdevs": 4, 00:25:54.813 "num_base_bdevs_discovered": 3, 00:25:54.813 
"num_base_bdevs_operational": 3, 00:25:54.813 "process": { 00:25:54.813 "type": "rebuild", 00:25:54.813 "target": "spare", 00:25:54.813 "progress": { 00:25:54.813 "blocks": 30720, 00:25:54.813 "percent": 48 00:25:54.813 } 00:25:54.813 }, 00:25:54.813 "base_bdevs_list": [ 00:25:54.813 { 00:25:54.813 "name": "spare", 00:25:54.813 "uuid": "1205954c-502b-596d-a9a5-a2de999ad975", 00:25:54.813 "is_configured": true, 00:25:54.813 "data_offset": 2048, 00:25:54.813 "data_size": 63488 00:25:54.813 }, 00:25:54.813 { 00:25:54.813 "name": null, 00:25:54.813 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.813 "is_configured": false, 00:25:54.813 "data_offset": 2048, 00:25:54.813 "data_size": 63488 00:25:54.813 }, 00:25:54.813 { 00:25:54.813 "name": "BaseBdev3", 00:25:54.813 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:54.813 "is_configured": true, 00:25:54.813 "data_offset": 2048, 00:25:54.813 "data_size": 63488 00:25:54.813 }, 00:25:54.813 { 00:25:54.813 "name": "BaseBdev4", 00:25:54.813 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:54.813 "is_configured": true, 00:25:54.813 "data_offset": 2048, 00:25:54.813 "data_size": 63488 00:25:54.813 } 00:25:54.813 ] 00:25:54.813 }' 00:25:54.813 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:54.813 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:54.813 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:54.813 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:54.813 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:55.073 [2024-07-12 16:02:15.398293] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:55.073 [2024-07-12 16:02:15.466444] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:55.073 [2024-07-12 16:02:15.466478] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:55.073 [2024-07-12 16:02:15.466489] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:55.073 [2024-07-12 16:02:15.466494] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.073 
16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.073 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.643 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.643 "name": "raid_bdev1", 00:25:55.643 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:55.643 "strip_size_kb": 0, 00:25:55.643 "state": "online", 00:25:55.643 "raid_level": "raid1", 00:25:55.643 "superblock": true, 00:25:55.643 "num_base_bdevs": 4, 00:25:55.643 "num_base_bdevs_discovered": 2, 00:25:55.643 "num_base_bdevs_operational": 2, 00:25:55.643 "base_bdevs_list": [ 00:25:55.643 { 00:25:55.643 "name": null, 00:25:55.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.643 "is_configured": false, 00:25:55.643 "data_offset": 2048, 00:25:55.643 "data_size": 63488 00:25:55.643 }, 00:25:55.643 { 00:25:55.643 "name": null, 00:25:55.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.643 "is_configured": false, 00:25:55.643 "data_offset": 2048, 00:25:55.643 "data_size": 63488 00:25:55.643 }, 00:25:55.643 { 00:25:55.643 "name": "BaseBdev3", 00:25:55.643 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:55.643 "is_configured": true, 00:25:55.643 "data_offset": 2048, 00:25:55.643 "data_size": 63488 00:25:55.643 }, 00:25:55.643 { 00:25:55.643 "name": "BaseBdev4", 00:25:55.643 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:55.643 "is_configured": true, 00:25:55.643 "data_offset": 2048, 00:25:55.643 "data_size": 63488 00:25:55.643 } 00:25:55.643 ] 00:25:55.643 }' 00:25:55.643 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.643 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:56.583 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:56.583 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:56.583 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:56.583 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:56.583 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:56.583 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.583 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.844 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:56.844 "name": "raid_bdev1", 00:25:56.844 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:56.844 "strip_size_kb": 0, 00:25:56.844 "state": "online", 00:25:56.844 "raid_level": "raid1", 00:25:56.844 "superblock": true, 00:25:56.844 "num_base_bdevs": 4, 00:25:56.844 "num_base_bdevs_discovered": 2, 00:25:56.844 "num_base_bdevs_operational": 2, 00:25:56.844 "base_bdevs_list": [ 00:25:56.844 { 00:25:56.844 "name": null, 00:25:56.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.844 
"is_configured": false, 00:25:56.844 "data_offset": 2048, 00:25:56.844 "data_size": 63488 00:25:56.844 }, 00:25:56.844 { 00:25:56.844 "name": null, 00:25:56.844 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.844 "is_configured": false, 00:25:56.844 "data_offset": 2048, 00:25:56.844 "data_size": 63488 00:25:56.844 }, 00:25:56.844 { 00:25:56.844 "name": "BaseBdev3", 00:25:56.844 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:56.844 "is_configured": true, 00:25:56.844 "data_offset": 2048, 00:25:56.844 "data_size": 63488 00:25:56.844 }, 00:25:56.844 { 00:25:56.844 "name": "BaseBdev4", 00:25:56.844 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:56.844 "is_configured": true, 00:25:56.844 "data_offset": 2048, 00:25:56.844 "data_size": 63488 00:25:56.844 } 00:25:56.844 ] 00:25:56.844 }' 00:25:56.844 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:57.104 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:57.104 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:57.104 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:57.104 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:57.673 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:57.673 [2024-07-12 16:02:18.065342] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:57.673 [2024-07-12 16:02:18.065371] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:57.673 [2024-07-12 16:02:18.065386] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a76700 00:25:57.673 [2024-07-12 16:02:18.065393] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:57.673 [2024-07-12 16:02:18.065665] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:57.673 [2024-07-12 16:02:18.065677] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:57.673 [2024-07-12 16:02:18.065727] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:57.673 [2024-07-12 16:02:18.065734] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:57.673 [2024-07-12 16:02:18.065740] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:57.673 BaseBdev1 00:25:57.673 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:59.054 "name": "raid_bdev1", 00:25:59.054 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:59.054 "strip_size_kb": 0, 00:25:59.054 "state": "online", 00:25:59.054 "raid_level": "raid1", 00:25:59.054 "superblock": true, 00:25:59.054 "num_base_bdevs": 4, 00:25:59.054 "num_base_bdevs_discovered": 2, 00:25:59.054 "num_base_bdevs_operational": 2, 00:25:59.054 "base_bdevs_list": [ 00:25:59.054 { 00:25:59.054 "name": null, 00:25:59.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.054 "is_configured": false, 00:25:59.054 "data_offset": 2048, 00:25:59.054 "data_size": 63488 00:25:59.054 }, 00:25:59.054 { 00:25:59.054 "name": null, 00:25:59.054 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.054 "is_configured": false, 00:25:59.054 "data_offset": 2048, 00:25:59.054 "data_size": 63488 00:25:59.054 }, 00:25:59.054 { 00:25:59.054 "name": "BaseBdev3", 00:25:59.054 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:59.054 "is_configured": true, 00:25:59.054 "data_offset": 2048, 00:25:59.054 "data_size": 63488 00:25:59.054 }, 00:25:59.054 { 00:25:59.054 "name": "BaseBdev4", 00:25:59.054 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:59.054 "is_configured": true, 00:25:59.054 "data_offset": 2048, 00:25:59.054 "data_size": 63488 00:25:59.054 } 00:25:59.054 ] 00:25:59.054 }' 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:59.054 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:59.993 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:59.993 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:59.993 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:59.993 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:59.993 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:59.993 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:59.993 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:59.993 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:25:59.993 "name": "raid_bdev1", 00:25:59.993 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:25:59.993 "strip_size_kb": 0, 00:25:59.993 "state": "online", 00:25:59.993 "raid_level": "raid1", 00:25:59.993 "superblock": true, 00:25:59.993 "num_base_bdevs": 4, 00:25:59.993 "num_base_bdevs_discovered": 2, 00:25:59.993 "num_base_bdevs_operational": 2, 00:25:59.993 "base_bdevs_list": [ 00:25:59.993 { 00:25:59.993 "name": null, 00:25:59.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.993 "is_configured": false, 00:25:59.993 "data_offset": 2048, 00:25:59.993 "data_size": 63488 00:25:59.993 }, 00:25:59.993 { 00:25:59.993 "name": null, 00:25:59.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:59.993 "is_configured": false, 00:25:59.993 "data_offset": 2048, 00:25:59.993 "data_size": 63488 00:25:59.993 }, 00:25:59.993 { 00:25:59.993 "name": "BaseBdev3", 00:25:59.993 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:25:59.993 "is_configured": true, 00:25:59.993 "data_offset": 2048, 00:25:59.993 "data_size": 63488 00:25:59.993 }, 00:25:59.993 { 00:25:59.993 "name": "BaseBdev4", 00:25:59.993 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:25:59.993 "is_configured": true, 00:25:59.993 "data_offset": 2048, 00:25:59.993 "data_size": 63488 00:25:59.993 } 00:25:59.993 ] 00:25:59.993 }' 00:25:59.993 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:00.253 16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:00.253 
16:02:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:00.822 [2024-07-12 16:02:21.017174] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:00.822 [2024-07-12 16:02:21.017263] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:26:00.822 [2024-07-12 16:02:21.017276] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:00.822 request: 00:26:00.822 { 00:26:00.822 "base_bdev": "BaseBdev1", 00:26:00.822 "raid_bdev": "raid_bdev1", 00:26:00.822 "method": "bdev_raid_add_base_bdev", 00:26:00.822 "req_id": 1 00:26:00.822 } 00:26:00.822 Got JSON-RPC error response 00:26:00.822 response: 00:26:00.822 { 00:26:00.822 "code": -22, 00:26:00.822 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:00.822 } 00:26:00.822 16:02:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:26:00.822 16:02:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:00.822 16:02:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:00.822 16:02:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:00.822 16:02:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.760 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.020 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:02.020 "name": "raid_bdev1", 00:26:02.020 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:26:02.020 "strip_size_kb": 0, 00:26:02.020 "state": "online", 00:26:02.020 "raid_level": "raid1", 00:26:02.020 "superblock": true, 00:26:02.020 "num_base_bdevs": 4, 00:26:02.020 "num_base_bdevs_discovered": 2, 00:26:02.020 "num_base_bdevs_operational": 2, 00:26:02.020 "base_bdevs_list": [ 
00:26:02.020 { 00:26:02.020 "name": null, 00:26:02.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.020 "is_configured": false, 00:26:02.020 "data_offset": 2048, 00:26:02.020 "data_size": 63488 00:26:02.020 }, 00:26:02.020 { 00:26:02.020 "name": null, 00:26:02.020 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.020 "is_configured": false, 00:26:02.020 "data_offset": 2048, 00:26:02.020 "data_size": 63488 00:26:02.020 }, 00:26:02.020 { 00:26:02.020 "name": "BaseBdev3", 00:26:02.020 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:26:02.020 "is_configured": true, 00:26:02.020 "data_offset": 2048, 00:26:02.020 "data_size": 63488 00:26:02.020 }, 00:26:02.020 { 00:26:02.020 "name": "BaseBdev4", 00:26:02.020 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:26:02.020 "is_configured": true, 00:26:02.020 "data_offset": 2048, 00:26:02.020 "data_size": 63488 00:26:02.020 } 00:26:02.020 ] 00:26:02.020 }' 00:26:02.020 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:02.020 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:02.589 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:02.589 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:02.589 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:02.589 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:02.589 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:02.589 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.589 16:02:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:02.850 "name": "raid_bdev1", 00:26:02.850 "uuid": "630a4a8f-e80e-4a29-847e-1d105326d2b2", 00:26:02.850 "strip_size_kb": 0, 00:26:02.850 "state": "online", 00:26:02.850 "raid_level": "raid1", 00:26:02.850 "superblock": true, 00:26:02.850 "num_base_bdevs": 4, 00:26:02.850 "num_base_bdevs_discovered": 2, 00:26:02.850 "num_base_bdevs_operational": 2, 00:26:02.850 "base_bdevs_list": [ 00:26:02.850 { 00:26:02.850 "name": null, 00:26:02.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.850 "is_configured": false, 00:26:02.850 "data_offset": 2048, 00:26:02.850 "data_size": 63488 00:26:02.850 }, 00:26:02.850 { 00:26:02.850 "name": null, 00:26:02.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:02.850 "is_configured": false, 00:26:02.850 "data_offset": 2048, 00:26:02.850 "data_size": 63488 00:26:02.850 }, 00:26:02.850 { 00:26:02.850 "name": "BaseBdev3", 00:26:02.850 "uuid": "7dd76dcc-8894-5d7c-9d17-4d55a69c6cf0", 00:26:02.850 "is_configured": true, 00:26:02.850 "data_offset": 2048, 00:26:02.850 "data_size": 63488 00:26:02.850 }, 00:26:02.850 { 00:26:02.850 "name": "BaseBdev4", 00:26:02.850 "uuid": "e9a3b042-8ab0-5272-88e7-103ec2042b59", 00:26:02.850 "is_configured": true, 00:26:02.850 "data_offset": 2048, 00:26:02.850 "data_size": 63488 00:26:02.850 } 00:26:02.850 ] 00:26:02.850 }' 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2652002 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 2652002 ']' 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 2652002 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2652002 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2652002' 00:26:02.850 killing process with pid 2652002 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 2652002 00:26:02.850 Received shutdown signal, test time was about 28.907002 seconds 00:26:02.850 00:26:02.850 Latency(us) 00:26:02.850 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:02.850 =================================================================================================================== 00:26:02.850 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:02.850 [2024-07-12 16:02:23.272576] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:02.850 [2024-07-12 16:02:23.272651] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:02.850 [2024-07-12 16:02:23.272698] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:02.850 [2024-07-12 16:02:23.272705] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a77720 name raid_bdev1, state offline 00:26:02.850 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 2652002 00:26:02.850 [2024-07-12 16:02:23.296218] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:03.111 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:03.111 00:26:03.111 real 0m33.561s 00:26:03.111 user 0m55.977s 00:26:03.111 sys 0m3.863s 00:26:03.111 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:03.111 16:02:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:03.111 ************************************ 00:26:03.111 END TEST raid_rebuild_test_sb_io 00:26:03.111 ************************************ 00:26:03.111 16:02:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:03.111 16:02:23 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:26:03.111 16:02:23 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:26:03.111 16:02:23 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test 
raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:26:03.111 16:02:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:03.111 16:02:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:03.111 16:02:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:03.111 ************************************ 00:26:03.111 START TEST raid_state_function_test_sb_4k 00:26:03.111 ************************************ 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=2657972 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2657972' 00:26:03.111 Process raid pid: 2657972 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 2657972 /var/tmp/spdk-raid.sock 00:26:03.111 
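The raid_state_function_test_sb_4k run that starts here repeats the state-machine checks with 4096-byte-block base bdevs. Condensed from the setup traced below ($rpc is an illustrative shorthand; the RPC names, sizes and bdev names are the ones the script uses):

rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
# Register the raid1 shell first; with no base bdevs present it sits in the "configuring" state.
$rpc bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
# A 32 MiB malloc bdev with 4 KiB blocks gives the 8192-block BaseBdev1 reported by bdev_get_bdevs.
$rpc bdev_malloc_create 32 4096 -b BaseBdev1
$rpc bdev_get_bdevs -b BaseBdev1 -t 2000
# Existed_Raid now shows one of two base bdevs discovered and remains "configuring".
$rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "Existed_Raid")'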
16:02:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2657972 ']' 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:03.111 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:03.111 16:02:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:03.371 [2024-07-12 16:02:23.559604] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:26:03.371 [2024-07-12 16:02:23.559656] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:03.371 [2024-07-12 16:02:23.649898] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:03.371 [2024-07-12 16:02:23.716097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:03.371 [2024-07-12 16:02:23.766935] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:03.371 [2024-07-12 16:02:23.766959] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:04.312 [2024-07-12 16:02:24.574281] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:04.312 [2024-07-12 16:02:24.574307] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:04.312 [2024-07-12 16:02:24.574313] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:04.312 [2024-07-12 16:02:24.574318] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.312 16:02:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:04.882 16:02:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:04.882 "name": "Existed_Raid", 00:26:04.882 "uuid": "2aa69904-3270-484c-820a-5e735774f6e9", 00:26:04.882 "strip_size_kb": 0, 00:26:04.882 "state": "configuring", 00:26:04.882 "raid_level": "raid1", 00:26:04.882 "superblock": true, 00:26:04.882 "num_base_bdevs": 2, 00:26:04.882 "num_base_bdevs_discovered": 0, 00:26:04.882 "num_base_bdevs_operational": 2, 00:26:04.882 "base_bdevs_list": [ 00:26:04.882 { 00:26:04.882 "name": "BaseBdev1", 00:26:04.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.882 "is_configured": false, 00:26:04.882 "data_offset": 0, 00:26:04.882 "data_size": 0 00:26:04.882 }, 00:26:04.882 { 00:26:04.882 "name": "BaseBdev2", 00:26:04.882 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.882 "is_configured": false, 00:26:04.882 "data_offset": 0, 00:26:04.882 "data_size": 0 00:26:04.882 } 00:26:04.882 ] 00:26:04.882 }' 00:26:04.882 16:02:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:04.882 16:02:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:05.822 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:05.822 [2024-07-12 16:02:26.246360] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:05.822 [2024-07-12 16:02:26.246379] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x803900 name Existed_Raid, state configuring 00:26:05.822 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:06.082 [2024-07-12 16:02:26.438861] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:06.082 [2024-07-12 16:02:26.438877] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:06.082 [2024-07-12 16:02:26.438882] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:06.082 [2024-07-12 16:02:26.438888] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:06.082 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:26:06.342 [2024-07-12 16:02:26.638037] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:06.342 BaseBdev1 00:26:06.342 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:06.342 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:06.342 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:06.342 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:06.342 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:06.342 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:06.342 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:06.602 16:02:26 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:06.602 [ 00:26:06.602 { 00:26:06.602 "name": "BaseBdev1", 00:26:06.602 "aliases": [ 00:26:06.602 "602c645d-c55f-474d-91fc-54fba2cb7f02" 00:26:06.602 ], 00:26:06.602 "product_name": "Malloc disk", 00:26:06.602 "block_size": 4096, 00:26:06.602 "num_blocks": 8192, 00:26:06.602 "uuid": "602c645d-c55f-474d-91fc-54fba2cb7f02", 00:26:06.602 "assigned_rate_limits": { 00:26:06.602 "rw_ios_per_sec": 0, 00:26:06.602 "rw_mbytes_per_sec": 0, 00:26:06.602 "r_mbytes_per_sec": 0, 00:26:06.602 "w_mbytes_per_sec": 0 00:26:06.602 }, 00:26:06.602 "claimed": true, 00:26:06.602 "claim_type": "exclusive_write", 00:26:06.602 "zoned": false, 00:26:06.602 "supported_io_types": { 00:26:06.602 "read": true, 00:26:06.602 "write": true, 00:26:06.602 "unmap": true, 00:26:06.602 "flush": true, 00:26:06.602 "reset": true, 00:26:06.602 "nvme_admin": false, 00:26:06.602 "nvme_io": false, 00:26:06.602 "nvme_io_md": false, 00:26:06.602 "write_zeroes": true, 00:26:06.602 "zcopy": true, 00:26:06.602 "get_zone_info": false, 00:26:06.602 "zone_management": false, 00:26:06.602 "zone_append": false, 00:26:06.602 "compare": false, 00:26:06.602 "compare_and_write": false, 00:26:06.602 "abort": true, 00:26:06.602 "seek_hole": false, 00:26:06.602 "seek_data": false, 00:26:06.602 "copy": true, 00:26:06.602 "nvme_iov_md": false 00:26:06.602 }, 00:26:06.602 "memory_domains": [ 00:26:06.602 { 00:26:06.602 "dma_device_id": "system", 00:26:06.602 "dma_device_type": 1 00:26:06.602 }, 00:26:06.602 { 00:26:06.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:06.602 "dma_device_type": 2 00:26:06.602 } 00:26:06.602 ], 00:26:06.602 "driver_specific": {} 00:26:06.602 } 00:26:06.602 ] 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.602 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:06.861 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:06.861 "name": "Existed_Raid", 00:26:06.861 "uuid": "be9b75a7-e8ff-48f4-a61b-5252bfc10b0d", 00:26:06.861 "strip_size_kb": 0, 00:26:06.861 "state": "configuring", 00:26:06.861 "raid_level": "raid1", 00:26:06.861 "superblock": true, 00:26:06.861 "num_base_bdevs": 2, 00:26:06.861 "num_base_bdevs_discovered": 1, 00:26:06.861 "num_base_bdevs_operational": 2, 00:26:06.861 "base_bdevs_list": [ 00:26:06.861 { 00:26:06.861 "name": "BaseBdev1", 00:26:06.861 "uuid": "602c645d-c55f-474d-91fc-54fba2cb7f02", 00:26:06.861 "is_configured": true, 00:26:06.861 "data_offset": 256, 00:26:06.862 "data_size": 7936 00:26:06.862 }, 00:26:06.862 { 00:26:06.862 "name": "BaseBdev2", 00:26:06.862 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:06.862 "is_configured": false, 00:26:06.862 "data_offset": 0, 00:26:06.862 "data_size": 0 00:26:06.862 } 00:26:06.862 ] 00:26:06.862 }' 00:26:06.862 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:06.862 16:02:27 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:07.801 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:08.062 [2024-07-12 16:02:28.262132] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:08.062 [2024-07-12 16:02:28.262160] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8031d0 name Existed_Raid, state configuring 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:08.062 [2024-07-12 16:02:28.414544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:08.062 [2024-07-12 16:02:28.415666] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:08.062 [2024-07-12 16:02:28.415688] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:08.062 
16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.062 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:08.321 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:08.321 "name": "Existed_Raid", 00:26:08.321 "uuid": "df941440-2d00-48a7-8c0a-88522c2d465a", 00:26:08.321 "strip_size_kb": 0, 00:26:08.321 "state": "configuring", 00:26:08.321 "raid_level": "raid1", 00:26:08.321 "superblock": true, 00:26:08.321 "num_base_bdevs": 2, 00:26:08.321 "num_base_bdevs_discovered": 1, 00:26:08.321 "num_base_bdevs_operational": 2, 00:26:08.321 "base_bdevs_list": [ 00:26:08.321 { 00:26:08.321 "name": "BaseBdev1", 00:26:08.321 "uuid": "602c645d-c55f-474d-91fc-54fba2cb7f02", 00:26:08.321 "is_configured": true, 00:26:08.321 "data_offset": 256, 00:26:08.321 "data_size": 7936 00:26:08.321 }, 00:26:08.321 { 00:26:08.321 "name": "BaseBdev2", 00:26:08.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.321 "is_configured": false, 00:26:08.321 "data_offset": 0, 00:26:08.321 "data_size": 0 00:26:08.321 } 00:26:08.321 ] 00:26:08.321 }' 00:26:08.321 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:08.321 16:02:28 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:09.320 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:26:09.320 [2024-07-12 16:02:29.682737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:09.320 [2024-07-12 16:02:29.682846] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x803e80 00:26:09.320 [2024-07-12 16:02:29.682854] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, 
blocklen 4096 00:26:09.320 [2024-07-12 16:02:29.682988] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x502290 00:26:09.320 [2024-07-12 16:02:29.683081] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x803e80 00:26:09.320 [2024-07-12 16:02:29.683087] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x803e80 00:26:09.320 [2024-07-12 16:02:29.683153] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:09.320 BaseBdev2 00:26:09.320 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:09.320 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:26:09.320 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:09.320 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:26:09.320 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:09.320 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:09.320 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:09.580 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:09.580 [ 00:26:09.580 { 00:26:09.580 "name": "BaseBdev2", 00:26:09.580 "aliases": [ 00:26:09.580 "316a478e-30d6-4e95-befa-06fd3f482d71" 00:26:09.580 ], 00:26:09.580 "product_name": "Malloc disk", 00:26:09.580 "block_size": 4096, 00:26:09.580 "num_blocks": 8192, 00:26:09.580 "uuid": "316a478e-30d6-4e95-befa-06fd3f482d71", 00:26:09.580 "assigned_rate_limits": { 00:26:09.580 "rw_ios_per_sec": 0, 00:26:09.580 "rw_mbytes_per_sec": 0, 00:26:09.580 "r_mbytes_per_sec": 0, 00:26:09.580 "w_mbytes_per_sec": 0 00:26:09.580 }, 00:26:09.580 "claimed": true, 00:26:09.580 "claim_type": "exclusive_write", 00:26:09.580 "zoned": false, 00:26:09.580 "supported_io_types": { 00:26:09.580 "read": true, 00:26:09.580 "write": true, 00:26:09.580 "unmap": true, 00:26:09.580 "flush": true, 00:26:09.580 "reset": true, 00:26:09.580 "nvme_admin": false, 00:26:09.580 "nvme_io": false, 00:26:09.580 "nvme_io_md": false, 00:26:09.580 "write_zeroes": true, 00:26:09.580 "zcopy": true, 00:26:09.580 "get_zone_info": false, 00:26:09.580 "zone_management": false, 00:26:09.580 "zone_append": false, 00:26:09.580 "compare": false, 00:26:09.580 "compare_and_write": false, 00:26:09.580 "abort": true, 00:26:09.580 "seek_hole": false, 00:26:09.580 "seek_data": false, 00:26:09.580 "copy": true, 00:26:09.580 "nvme_iov_md": false 00:26:09.580 }, 00:26:09.580 "memory_domains": [ 00:26:09.580 { 00:26:09.580 "dma_device_id": "system", 00:26:09.580 "dma_device_type": 1 00:26:09.580 }, 00:26:09.580 { 00:26:09.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:09.580 "dma_device_type": 2 00:26:09.580 } 00:26:09.580 ], 00:26:09.580 "driver_specific": {} 00:26:09.580 } 00:26:09.580 ] 00:26:09.580 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:26:09.580 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # 
(( i++ )) 00:26:09.580 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:09.580 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:09.580 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:09.581 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:09.581 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:09.581 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:09.581 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:09.581 16:02:29 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:09.581 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:09.581 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:09.581 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:09.581 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:09.581 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:09.841 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:09.841 "name": "Existed_Raid", 00:26:09.841 "uuid": "df941440-2d00-48a7-8c0a-88522c2d465a", 00:26:09.841 "strip_size_kb": 0, 00:26:09.841 "state": "online", 00:26:09.841 "raid_level": "raid1", 00:26:09.841 "superblock": true, 00:26:09.841 "num_base_bdevs": 2, 00:26:09.841 "num_base_bdevs_discovered": 2, 00:26:09.841 "num_base_bdevs_operational": 2, 00:26:09.841 "base_bdevs_list": [ 00:26:09.841 { 00:26:09.841 "name": "BaseBdev1", 00:26:09.841 "uuid": "602c645d-c55f-474d-91fc-54fba2cb7f02", 00:26:09.841 "is_configured": true, 00:26:09.841 "data_offset": 256, 00:26:09.841 "data_size": 7936 00:26:09.841 }, 00:26:09.841 { 00:26:09.841 "name": "BaseBdev2", 00:26:09.841 "uuid": "316a478e-30d6-4e95-befa-06fd3f482d71", 00:26:09.841 "is_configured": true, 00:26:09.841 "data_offset": 256, 00:26:09.841 "data_size": 7936 00:26:09.841 } 00:26:09.841 ] 00:26:09.841 }' 00:26:09.841 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:09.841 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:10.410 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:10.410 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:10.410 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:10.410 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:10.410 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:10.410 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@198 -- # local name 00:26:10.410 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:10.410 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:10.670 [2024-07-12 16:02:30.958174] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:10.670 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:10.670 "name": "Existed_Raid", 00:26:10.670 "aliases": [ 00:26:10.670 "df941440-2d00-48a7-8c0a-88522c2d465a" 00:26:10.670 ], 00:26:10.670 "product_name": "Raid Volume", 00:26:10.670 "block_size": 4096, 00:26:10.670 "num_blocks": 7936, 00:26:10.670 "uuid": "df941440-2d00-48a7-8c0a-88522c2d465a", 00:26:10.670 "assigned_rate_limits": { 00:26:10.670 "rw_ios_per_sec": 0, 00:26:10.670 "rw_mbytes_per_sec": 0, 00:26:10.670 "r_mbytes_per_sec": 0, 00:26:10.670 "w_mbytes_per_sec": 0 00:26:10.670 }, 00:26:10.670 "claimed": false, 00:26:10.670 "zoned": false, 00:26:10.670 "supported_io_types": { 00:26:10.670 "read": true, 00:26:10.670 "write": true, 00:26:10.670 "unmap": false, 00:26:10.670 "flush": false, 00:26:10.670 "reset": true, 00:26:10.670 "nvme_admin": false, 00:26:10.670 "nvme_io": false, 00:26:10.670 "nvme_io_md": false, 00:26:10.670 "write_zeroes": true, 00:26:10.670 "zcopy": false, 00:26:10.670 "get_zone_info": false, 00:26:10.670 "zone_management": false, 00:26:10.670 "zone_append": false, 00:26:10.670 "compare": false, 00:26:10.670 "compare_and_write": false, 00:26:10.670 "abort": false, 00:26:10.670 "seek_hole": false, 00:26:10.670 "seek_data": false, 00:26:10.670 "copy": false, 00:26:10.670 "nvme_iov_md": false 00:26:10.670 }, 00:26:10.670 "memory_domains": [ 00:26:10.670 { 00:26:10.670 "dma_device_id": "system", 00:26:10.670 "dma_device_type": 1 00:26:10.670 }, 00:26:10.670 { 00:26:10.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:10.670 "dma_device_type": 2 00:26:10.670 }, 00:26:10.670 { 00:26:10.670 "dma_device_id": "system", 00:26:10.670 "dma_device_type": 1 00:26:10.670 }, 00:26:10.670 { 00:26:10.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:10.670 "dma_device_type": 2 00:26:10.670 } 00:26:10.670 ], 00:26:10.670 "driver_specific": { 00:26:10.670 "raid": { 00:26:10.670 "uuid": "df941440-2d00-48a7-8c0a-88522c2d465a", 00:26:10.670 "strip_size_kb": 0, 00:26:10.670 "state": "online", 00:26:10.670 "raid_level": "raid1", 00:26:10.670 "superblock": true, 00:26:10.670 "num_base_bdevs": 2, 00:26:10.670 "num_base_bdevs_discovered": 2, 00:26:10.670 "num_base_bdevs_operational": 2, 00:26:10.670 "base_bdevs_list": [ 00:26:10.670 { 00:26:10.670 "name": "BaseBdev1", 00:26:10.670 "uuid": "602c645d-c55f-474d-91fc-54fba2cb7f02", 00:26:10.670 "is_configured": true, 00:26:10.670 "data_offset": 256, 00:26:10.670 "data_size": 7936 00:26:10.670 }, 00:26:10.670 { 00:26:10.670 "name": "BaseBdev2", 00:26:10.670 "uuid": "316a478e-30d6-4e95-befa-06fd3f482d71", 00:26:10.670 "is_configured": true, 00:26:10.670 "data_offset": 256, 00:26:10.670 "data_size": 7936 00:26:10.670 } 00:26:10.670 ] 00:26:10.670 } 00:26:10.670 } 00:26:10.670 }' 00:26:10.670 16:02:30 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:10.670 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='BaseBdev1 00:26:10.670 BaseBdev2' 00:26:10.670 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:10.670 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:10.670 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:10.930 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:10.930 "name": "BaseBdev1", 00:26:10.930 "aliases": [ 00:26:10.930 "602c645d-c55f-474d-91fc-54fba2cb7f02" 00:26:10.930 ], 00:26:10.930 "product_name": "Malloc disk", 00:26:10.930 "block_size": 4096, 00:26:10.930 "num_blocks": 8192, 00:26:10.930 "uuid": "602c645d-c55f-474d-91fc-54fba2cb7f02", 00:26:10.930 "assigned_rate_limits": { 00:26:10.930 "rw_ios_per_sec": 0, 00:26:10.930 "rw_mbytes_per_sec": 0, 00:26:10.930 "r_mbytes_per_sec": 0, 00:26:10.930 "w_mbytes_per_sec": 0 00:26:10.930 }, 00:26:10.930 "claimed": true, 00:26:10.930 "claim_type": "exclusive_write", 00:26:10.930 "zoned": false, 00:26:10.930 "supported_io_types": { 00:26:10.930 "read": true, 00:26:10.930 "write": true, 00:26:10.930 "unmap": true, 00:26:10.930 "flush": true, 00:26:10.930 "reset": true, 00:26:10.930 "nvme_admin": false, 00:26:10.930 "nvme_io": false, 00:26:10.930 "nvme_io_md": false, 00:26:10.930 "write_zeroes": true, 00:26:10.930 "zcopy": true, 00:26:10.930 "get_zone_info": false, 00:26:10.930 "zone_management": false, 00:26:10.930 "zone_append": false, 00:26:10.930 "compare": false, 00:26:10.930 "compare_and_write": false, 00:26:10.930 "abort": true, 00:26:10.930 "seek_hole": false, 00:26:10.930 "seek_data": false, 00:26:10.930 "copy": true, 00:26:10.930 "nvme_iov_md": false 00:26:10.930 }, 00:26:10.930 "memory_domains": [ 00:26:10.930 { 00:26:10.930 "dma_device_id": "system", 00:26:10.930 "dma_device_type": 1 00:26:10.930 }, 00:26:10.930 { 00:26:10.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:10.930 "dma_device_type": 2 00:26:10.930 } 00:26:10.930 ], 00:26:10.930 "driver_specific": {} 00:26:10.930 }' 00:26:10.930 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:10.930 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:11.189 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:11.189 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:11.189 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:11.189 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:11.189 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:11.189 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:11.449 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:11.449 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:11.449 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:11.449 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:11.449 16:02:31 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:11.449 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:11.449 16:02:31 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:12.018 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:12.018 "name": "BaseBdev2", 00:26:12.018 "aliases": [ 00:26:12.018 "316a478e-30d6-4e95-befa-06fd3f482d71" 00:26:12.018 ], 00:26:12.018 "product_name": "Malloc disk", 00:26:12.018 "block_size": 4096, 00:26:12.018 "num_blocks": 8192, 00:26:12.018 "uuid": "316a478e-30d6-4e95-befa-06fd3f482d71", 00:26:12.018 "assigned_rate_limits": { 00:26:12.018 "rw_ios_per_sec": 0, 00:26:12.018 "rw_mbytes_per_sec": 0, 00:26:12.018 "r_mbytes_per_sec": 0, 00:26:12.018 "w_mbytes_per_sec": 0 00:26:12.018 }, 00:26:12.018 "claimed": true, 00:26:12.018 "claim_type": "exclusive_write", 00:26:12.018 "zoned": false, 00:26:12.018 "supported_io_types": { 00:26:12.018 "read": true, 00:26:12.018 "write": true, 00:26:12.018 "unmap": true, 00:26:12.018 "flush": true, 00:26:12.018 "reset": true, 00:26:12.018 "nvme_admin": false, 00:26:12.018 "nvme_io": false, 00:26:12.018 "nvme_io_md": false, 00:26:12.018 "write_zeroes": true, 00:26:12.018 "zcopy": true, 00:26:12.018 "get_zone_info": false, 00:26:12.018 "zone_management": false, 00:26:12.018 "zone_append": false, 00:26:12.018 "compare": false, 00:26:12.018 "compare_and_write": false, 00:26:12.018 "abort": true, 00:26:12.018 "seek_hole": false, 00:26:12.018 "seek_data": false, 00:26:12.018 "copy": true, 00:26:12.018 "nvme_iov_md": false 00:26:12.018 }, 00:26:12.018 "memory_domains": [ 00:26:12.018 { 00:26:12.018 "dma_device_id": "system", 00:26:12.018 "dma_device_type": 1 00:26:12.018 }, 00:26:12.018 { 00:26:12.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:12.018 "dma_device_type": 2 00:26:12.018 } 00:26:12.018 ], 00:26:12.018 "driver_specific": {} 00:26:12.018 }' 00:26:12.018 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:12.018 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:12.277 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:12.277 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:12.277 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:12.277 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:12.277 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:12.277 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:12.536 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:12.536 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:12.536 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:12.536 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:12.536 16:02:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:13.104 [2024-07-12 16:02:33.412282] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.104 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:13.673 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:13.673 "name": "Existed_Raid", 00:26:13.673 "uuid": "df941440-2d00-48a7-8c0a-88522c2d465a", 00:26:13.673 "strip_size_kb": 0, 00:26:13.673 "state": "online", 00:26:13.673 "raid_level": "raid1", 00:26:13.673 "superblock": true, 00:26:13.673 "num_base_bdevs": 2, 00:26:13.673 "num_base_bdevs_discovered": 1, 00:26:13.673 "num_base_bdevs_operational": 1, 00:26:13.673 "base_bdevs_list": [ 00:26:13.673 { 00:26:13.673 "name": null, 00:26:13.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.673 "is_configured": false, 00:26:13.673 "data_offset": 256, 00:26:13.673 "data_size": 7936 00:26:13.673 }, 00:26:13.673 { 00:26:13.673 "name": "BaseBdev2", 00:26:13.673 "uuid": "316a478e-30d6-4e95-befa-06fd3f482d71", 00:26:13.673 "is_configured": true, 00:26:13.673 "data_offset": 256, 00:26:13.673 "data_size": 7936 00:26:13.673 } 00:26:13.673 ] 00:26:13.673 }' 00:26:13.673 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:13.673 16:02:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:14.612 16:02:34 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:14.612 16:02:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:14.612 16:02:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.612 16:02:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:14.871 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:14.871 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:14.871 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:14.871 [2024-07-12 16:02:35.244924] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:14.871 [2024-07-12 16:02:35.244984] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:14.871 [2024-07-12 16:02:35.251040] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:14.871 [2024-07-12 16:02:35.251065] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:14.871 [2024-07-12 16:02:35.251072] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x803e80 name Existed_Raid, state offline 00:26:14.871 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:14.871 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:14.871 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.871 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 2657972 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2657972 ']' 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2657972 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2657972 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2657972' 00:26:15.131 killing process with pid 2657972 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2657972 00:26:15.131 [2024-07-12 16:02:35.507439] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:15.131 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2657972 00:26:15.131 [2024-07-12 16:02:35.508032] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:15.392 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:26:15.392 00:26:15.392 real 0m12.132s 00:26:15.392 user 0m22.451s 00:26:15.392 sys 0m1.547s 00:26:15.392 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:15.392 16:02:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:15.392 ************************************ 00:26:15.392 END TEST raid_state_function_test_sb_4k 00:26:15.392 ************************************ 00:26:15.392 16:02:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:15.392 16:02:35 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:26:15.392 16:02:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:26:15.392 16:02:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:15.392 16:02:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:15.392 ************************************ 00:26:15.392 START TEST raid_superblock_test_4k 00:26:15.392 ************************************ 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@411 -- # raid_pid=2660285 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 2660285 /var/tmp/spdk-raid.sock 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 2660285 ']' 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:15.392 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:15.392 16:02:35 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:15.392 [2024-07-12 16:02:35.769896] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:26:15.392 [2024-07-12 16:02:35.769950] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2660285 ] 00:26:15.652 [2024-07-12 16:02:35.857906] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:15.652 [2024-07-12 16:02:35.926103] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:15.652 [2024-07-12 16:02:35.964705] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:15.652 [2024-07-12 16:02:35.964732] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:16.222 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:26:16.482 malloc1 00:26:16.482 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:16.741 [2024-07-12 16:02:36.958869] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:16.741 [2024-07-12 16:02:36.958902] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:16.741 [2024-07-12 16:02:36.958913] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2571b50 00:26:16.741 [2024-07-12 16:02:36.958920] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:16.741 [2024-07-12 16:02:36.960246] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:16.741 [2024-07-12 16:02:36.960264] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:16.741 pt1 00:26:16.741 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:16.741 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:16.741 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:26:16.741 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:26:16.741 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:16.741 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:16.741 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:16.741 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:16.741 16:02:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:26:16.741 malloc2 00:26:16.741 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:17.001 [2024-07-12 16:02:37.341964] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:17.001 [2024-07-12 16:02:37.341991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:17.001 [2024-07-12 16:02:37.342000] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2572df0 00:26:17.001 [2024-07-12 16:02:37.342006] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:17.001 [2024-07-12 16:02:37.343231] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:17.001 [2024-07-12 16:02:37.343249] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:17.001 pt2 00:26:17.001 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:17.001 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:17.001 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:26:17.261 [2024-07-12 16:02:37.530450] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:17.261 [2024-07-12 16:02:37.531461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:17.261 [2024-07-12 16:02:37.531574] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27180f0 00:26:17.261 [2024-07-12 16:02:37.531582] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:17.261 [2024-07-12 16:02:37.531741] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2588a40 00:26:17.261 [2024-07-12 16:02:37.531853] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27180f0 00:26:17.261 [2024-07-12 16:02:37.531859] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27180f0 00:26:17.261 [2024-07-12 16:02:37.531930] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.261 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.520 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:17.520 "name": "raid_bdev1", 00:26:17.520 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:17.520 "strip_size_kb": 0, 00:26:17.520 "state": "online", 00:26:17.520 "raid_level": "raid1", 00:26:17.520 "superblock": true, 00:26:17.520 "num_base_bdevs": 2, 00:26:17.520 "num_base_bdevs_discovered": 2, 00:26:17.520 "num_base_bdevs_operational": 2, 00:26:17.520 "base_bdevs_list": [ 00:26:17.520 { 00:26:17.520 "name": "pt1", 00:26:17.520 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:17.520 "is_configured": true, 00:26:17.520 "data_offset": 256, 00:26:17.520 "data_size": 7936 00:26:17.520 }, 00:26:17.520 { 00:26:17.520 "name": "pt2", 00:26:17.520 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:17.520 "is_configured": true, 00:26:17.520 "data_offset": 256, 00:26:17.520 "data_size": 7936 00:26:17.520 } 00:26:17.520 ] 00:26:17.520 }' 00:26:17.520 16:02:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:17.520 16:02:37 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@10 -- # set +x 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:18.090 [2024-07-12 16:02:38.448935] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:18.090 "name": "raid_bdev1", 00:26:18.090 "aliases": [ 00:26:18.090 "24043341-8370-456f-9cc1-0a354a830552" 00:26:18.090 ], 00:26:18.090 "product_name": "Raid Volume", 00:26:18.090 "block_size": 4096, 00:26:18.090 "num_blocks": 7936, 00:26:18.090 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:18.090 "assigned_rate_limits": { 00:26:18.090 "rw_ios_per_sec": 0, 00:26:18.090 "rw_mbytes_per_sec": 0, 00:26:18.090 "r_mbytes_per_sec": 0, 00:26:18.090 "w_mbytes_per_sec": 0 00:26:18.090 }, 00:26:18.090 "claimed": false, 00:26:18.090 "zoned": false, 00:26:18.090 "supported_io_types": { 00:26:18.090 "read": true, 00:26:18.090 "write": true, 00:26:18.090 "unmap": false, 00:26:18.090 "flush": false, 00:26:18.090 "reset": true, 00:26:18.090 "nvme_admin": false, 00:26:18.090 "nvme_io": false, 00:26:18.090 "nvme_io_md": false, 00:26:18.090 "write_zeroes": true, 00:26:18.090 "zcopy": false, 00:26:18.090 "get_zone_info": false, 00:26:18.090 "zone_management": false, 00:26:18.090 "zone_append": false, 00:26:18.090 "compare": false, 00:26:18.090 "compare_and_write": false, 00:26:18.090 "abort": false, 00:26:18.090 "seek_hole": false, 00:26:18.090 "seek_data": false, 00:26:18.090 "copy": false, 00:26:18.090 "nvme_iov_md": false 00:26:18.090 }, 00:26:18.090 "memory_domains": [ 00:26:18.090 { 00:26:18.090 "dma_device_id": "system", 00:26:18.090 "dma_device_type": 1 00:26:18.090 }, 00:26:18.090 { 00:26:18.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:18.090 "dma_device_type": 2 00:26:18.090 }, 00:26:18.090 { 00:26:18.090 "dma_device_id": "system", 00:26:18.090 "dma_device_type": 1 00:26:18.090 }, 00:26:18.090 { 00:26:18.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:18.090 "dma_device_type": 2 00:26:18.090 } 00:26:18.090 ], 00:26:18.090 "driver_specific": { 00:26:18.090 "raid": { 00:26:18.090 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:18.090 "strip_size_kb": 0, 00:26:18.090 "state": "online", 00:26:18.090 "raid_level": "raid1", 00:26:18.090 "superblock": true, 00:26:18.090 "num_base_bdevs": 2, 00:26:18.090 "num_base_bdevs_discovered": 2, 00:26:18.090 "num_base_bdevs_operational": 2, 00:26:18.090 "base_bdevs_list": [ 00:26:18.090 { 00:26:18.090 "name": "pt1", 00:26:18.090 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:18.090 "is_configured": true, 00:26:18.090 
"data_offset": 256, 00:26:18.090 "data_size": 7936 00:26:18.090 }, 00:26:18.090 { 00:26:18.090 "name": "pt2", 00:26:18.090 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:18.090 "is_configured": true, 00:26:18.090 "data_offset": 256, 00:26:18.090 "data_size": 7936 00:26:18.090 } 00:26:18.090 ] 00:26:18.090 } 00:26:18.090 } 00:26:18.090 }' 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:18.090 pt2' 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:18.090 16:02:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:18.660 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:18.660 "name": "pt1", 00:26:18.660 "aliases": [ 00:26:18.660 "00000000-0000-0000-0000-000000000001" 00:26:18.660 ], 00:26:18.660 "product_name": "passthru", 00:26:18.660 "block_size": 4096, 00:26:18.660 "num_blocks": 8192, 00:26:18.660 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:18.660 "assigned_rate_limits": { 00:26:18.660 "rw_ios_per_sec": 0, 00:26:18.660 "rw_mbytes_per_sec": 0, 00:26:18.660 "r_mbytes_per_sec": 0, 00:26:18.660 "w_mbytes_per_sec": 0 00:26:18.660 }, 00:26:18.660 "claimed": true, 00:26:18.660 "claim_type": "exclusive_write", 00:26:18.660 "zoned": false, 00:26:18.660 "supported_io_types": { 00:26:18.660 "read": true, 00:26:18.660 "write": true, 00:26:18.660 "unmap": true, 00:26:18.660 "flush": true, 00:26:18.660 "reset": true, 00:26:18.660 "nvme_admin": false, 00:26:18.660 "nvme_io": false, 00:26:18.660 "nvme_io_md": false, 00:26:18.660 "write_zeroes": true, 00:26:18.660 "zcopy": true, 00:26:18.660 "get_zone_info": false, 00:26:18.660 "zone_management": false, 00:26:18.660 "zone_append": false, 00:26:18.660 "compare": false, 00:26:18.660 "compare_and_write": false, 00:26:18.660 "abort": true, 00:26:18.660 "seek_hole": false, 00:26:18.660 "seek_data": false, 00:26:18.660 "copy": true, 00:26:18.660 "nvme_iov_md": false 00:26:18.660 }, 00:26:18.660 "memory_domains": [ 00:26:18.660 { 00:26:18.660 "dma_device_id": "system", 00:26:18.660 "dma_device_type": 1 00:26:18.660 }, 00:26:18.660 { 00:26:18.660 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:18.661 "dma_device_type": 2 00:26:18.661 } 00:26:18.661 ], 00:26:18.661 "driver_specific": { 00:26:18.661 "passthru": { 00:26:18.661 "name": "pt1", 00:26:18.661 "base_bdev_name": "malloc1" 00:26:18.661 } 00:26:18.661 } 00:26:18.661 }' 00:26:18.661 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:18.661 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:18.920 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:18.920 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:18.920 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:18.920 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:18.920 16:02:39 bdev_raid.raid_superblock_test_4k 
-- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:18.920 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:18.920 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:18.920 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:19.180 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:19.180 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:19.180 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:19.180 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:19.180 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:19.180 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:19.180 "name": "pt2", 00:26:19.180 "aliases": [ 00:26:19.180 "00000000-0000-0000-0000-000000000002" 00:26:19.180 ], 00:26:19.180 "product_name": "passthru", 00:26:19.180 "block_size": 4096, 00:26:19.180 "num_blocks": 8192, 00:26:19.180 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:19.180 "assigned_rate_limits": { 00:26:19.180 "rw_ios_per_sec": 0, 00:26:19.180 "rw_mbytes_per_sec": 0, 00:26:19.180 "r_mbytes_per_sec": 0, 00:26:19.180 "w_mbytes_per_sec": 0 00:26:19.180 }, 00:26:19.180 "claimed": true, 00:26:19.180 "claim_type": "exclusive_write", 00:26:19.180 "zoned": false, 00:26:19.180 "supported_io_types": { 00:26:19.180 "read": true, 00:26:19.180 "write": true, 00:26:19.180 "unmap": true, 00:26:19.180 "flush": true, 00:26:19.180 "reset": true, 00:26:19.180 "nvme_admin": false, 00:26:19.180 "nvme_io": false, 00:26:19.180 "nvme_io_md": false, 00:26:19.180 "write_zeroes": true, 00:26:19.181 "zcopy": true, 00:26:19.181 "get_zone_info": false, 00:26:19.181 "zone_management": false, 00:26:19.181 "zone_append": false, 00:26:19.181 "compare": false, 00:26:19.181 "compare_and_write": false, 00:26:19.181 "abort": true, 00:26:19.181 "seek_hole": false, 00:26:19.181 "seek_data": false, 00:26:19.181 "copy": true, 00:26:19.181 "nvme_iov_md": false 00:26:19.181 }, 00:26:19.181 "memory_domains": [ 00:26:19.181 { 00:26:19.181 "dma_device_id": "system", 00:26:19.181 "dma_device_type": 1 00:26:19.181 }, 00:26:19.181 { 00:26:19.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:19.181 "dma_device_type": 2 00:26:19.181 } 00:26:19.181 ], 00:26:19.181 "driver_specific": { 00:26:19.181 "passthru": { 00:26:19.181 "name": "pt2", 00:26:19.181 "base_bdev_name": "malloc2" 00:26:19.181 } 00:26:19.181 } 00:26:19.181 }' 00:26:19.181 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:19.441 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:19.441 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:19.441 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:19.441 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:19.441 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:19.441 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:19.441 16:02:39 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:19.441 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:19.441 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:19.701 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:19.701 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:19.701 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:19.701 16:02:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:19.701 [2024-07-12 16:02:40.133213] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:19.960 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=24043341-8370-456f-9cc1-0a354a830552 00:26:19.960 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 24043341-8370-456f-9cc1-0a354a830552 ']' 00:26:19.960 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:19.960 [2024-07-12 16:02:40.325486] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:19.960 [2024-07-12 16:02:40.325496] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:19.960 [2024-07-12 16:02:40.325534] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:19.960 [2024-07-12 16:02:40.325572] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:19.960 [2024-07-12 16:02:40.325578] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27180f0 name raid_bdev1, state offline 00:26:19.960 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:19.960 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:20.219 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:20.219 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:20.220 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:20.220 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:20.479 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:20.479 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:20.479 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:20.479 16:02:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:20.739 16:02:41 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:20.739 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:20.739 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:26:20.739 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:20.740 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:20.740 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:20.740 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:20.740 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:20.740 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:20.740 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:20.740 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:20.740 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:20.740 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:21.000 [2024-07-12 16:02:41.275857] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:21.000 [2024-07-12 16:02:41.276919] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:21.000 [2024-07-12 16:02:41.276962] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:21.000 [2024-07-12 16:02:41.276988] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:21.000 [2024-07-12 16:02:41.276998] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:21.000 [2024-07-12 16:02:41.277004] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2716f30 name raid_bdev1, state configuring 00:26:21.000 request: 00:26:21.000 { 00:26:21.000 "name": "raid_bdev1", 00:26:21.000 "raid_level": "raid1", 00:26:21.000 "base_bdevs": [ 00:26:21.000 "malloc1", 00:26:21.000 "malloc2" 00:26:21.000 ], 00:26:21.000 "superblock": false, 00:26:21.000 "method": "bdev_raid_create", 00:26:21.000 "req_id": 1 00:26:21.000 } 00:26:21.000 Got JSON-RPC error response 00:26:21.000 response: 00:26:21.000 { 00:26:21.000 "code": -17, 00:26:21.000 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:21.000 } 00:26:21.000 16:02:41 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@651 -- # es=1 00:26:21.000 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:21.000 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:21.000 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:21.000 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.000 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:21.260 [2024-07-12 16:02:41.660797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:21.260 [2024-07-12 16:02:41.660820] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:21.260 [2024-07-12 16:02:41.660831] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2717920 00:26:21.260 [2024-07-12 16:02:41.660836] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:21.260 [2024-07-12 16:02:41.662116] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:21.260 [2024-07-12 16:02:41.662135] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:21.260 [2024-07-12 16:02:41.662177] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:21.260 [2024-07-12 16:02:41.662195] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:21.260 pt1 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.260 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:26:21.520 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:21.520 "name": "raid_bdev1", 00:26:21.520 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:21.520 "strip_size_kb": 0, 00:26:21.520 "state": "configuring", 00:26:21.520 "raid_level": "raid1", 00:26:21.520 "superblock": true, 00:26:21.520 "num_base_bdevs": 2, 00:26:21.520 "num_base_bdevs_discovered": 1, 00:26:21.520 "num_base_bdevs_operational": 2, 00:26:21.520 "base_bdevs_list": [ 00:26:21.520 { 00:26:21.520 "name": "pt1", 00:26:21.520 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:21.520 "is_configured": true, 00:26:21.520 "data_offset": 256, 00:26:21.520 "data_size": 7936 00:26:21.520 }, 00:26:21.520 { 00:26:21.520 "name": null, 00:26:21.520 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:21.520 "is_configured": false, 00:26:21.520 "data_offset": 256, 00:26:21.520 "data_size": 7936 00:26:21.520 } 00:26:21.520 ] 00:26:21.520 }' 00:26:21.520 16:02:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:21.520 16:02:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:22.089 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:26:22.089 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:22.090 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:22.090 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:22.350 [2024-07-12 16:02:42.575107] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:22.350 [2024-07-12 16:02:42.575133] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:22.350 [2024-07-12 16:02:42.575143] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2571d80 00:26:22.350 [2024-07-12 16:02:42.575149] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:22.350 [2024-07-12 16:02:42.575407] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:22.350 [2024-07-12 16:02:42.575418] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:22.350 [2024-07-12 16:02:42.575458] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:22.350 [2024-07-12 16:02:42.575470] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:22.350 [2024-07-12 16:02:42.575546] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2719f20 00:26:22.350 [2024-07-12 16:02:42.575552] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:22.350 [2024-07-12 16:02:42.575685] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2569190 00:26:22.350 [2024-07-12 16:02:42.575791] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2719f20 00:26:22.350 [2024-07-12 16:02:42.575797] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2719f20 00:26:22.350 [2024-07-12 16:02:42.575869] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:22.350 pt2 00:26:22.350 16:02:42 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:22.350 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:22.350 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.351 "name": "raid_bdev1", 00:26:22.351 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:22.351 "strip_size_kb": 0, 00:26:22.351 "state": "online", 00:26:22.351 "raid_level": "raid1", 00:26:22.351 "superblock": true, 00:26:22.351 "num_base_bdevs": 2, 00:26:22.351 "num_base_bdevs_discovered": 2, 00:26:22.351 "num_base_bdevs_operational": 2, 00:26:22.351 "base_bdevs_list": [ 00:26:22.351 { 00:26:22.351 "name": "pt1", 00:26:22.351 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:22.351 "is_configured": true, 00:26:22.351 "data_offset": 256, 00:26:22.351 "data_size": 7936 00:26:22.351 }, 00:26:22.351 { 00:26:22.351 "name": "pt2", 00:26:22.351 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:22.351 "is_configured": true, 00:26:22.351 "data_offset": 256, 00:26:22.351 "data_size": 7936 00:26:22.351 } 00:26:22.351 ] 00:26:22.351 }' 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.351 16:02:42 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:22.925 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:26:22.925 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:22.925 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:22.925 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:22.925 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:22.925 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:26:22.925 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:22.925 16:02:43 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:23.185 [2024-07-12 16:02:43.489621] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:23.186 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:23.186 "name": "raid_bdev1", 00:26:23.186 "aliases": [ 00:26:23.186 "24043341-8370-456f-9cc1-0a354a830552" 00:26:23.186 ], 00:26:23.186 "product_name": "Raid Volume", 00:26:23.186 "block_size": 4096, 00:26:23.186 "num_blocks": 7936, 00:26:23.186 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:23.186 "assigned_rate_limits": { 00:26:23.186 "rw_ios_per_sec": 0, 00:26:23.186 "rw_mbytes_per_sec": 0, 00:26:23.186 "r_mbytes_per_sec": 0, 00:26:23.186 "w_mbytes_per_sec": 0 00:26:23.186 }, 00:26:23.186 "claimed": false, 00:26:23.186 "zoned": false, 00:26:23.186 "supported_io_types": { 00:26:23.186 "read": true, 00:26:23.186 "write": true, 00:26:23.186 "unmap": false, 00:26:23.186 "flush": false, 00:26:23.186 "reset": true, 00:26:23.186 "nvme_admin": false, 00:26:23.186 "nvme_io": false, 00:26:23.186 "nvme_io_md": false, 00:26:23.186 "write_zeroes": true, 00:26:23.186 "zcopy": false, 00:26:23.186 "get_zone_info": false, 00:26:23.186 "zone_management": false, 00:26:23.186 "zone_append": false, 00:26:23.186 "compare": false, 00:26:23.186 "compare_and_write": false, 00:26:23.186 "abort": false, 00:26:23.186 "seek_hole": false, 00:26:23.186 "seek_data": false, 00:26:23.186 "copy": false, 00:26:23.186 "nvme_iov_md": false 00:26:23.186 }, 00:26:23.186 "memory_domains": [ 00:26:23.186 { 00:26:23.186 "dma_device_id": "system", 00:26:23.186 "dma_device_type": 1 00:26:23.186 }, 00:26:23.186 { 00:26:23.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:23.186 "dma_device_type": 2 00:26:23.186 }, 00:26:23.186 { 00:26:23.186 "dma_device_id": "system", 00:26:23.186 "dma_device_type": 1 00:26:23.186 }, 00:26:23.186 { 00:26:23.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:23.186 "dma_device_type": 2 00:26:23.186 } 00:26:23.186 ], 00:26:23.186 "driver_specific": { 00:26:23.186 "raid": { 00:26:23.186 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:23.186 "strip_size_kb": 0, 00:26:23.186 "state": "online", 00:26:23.186 "raid_level": "raid1", 00:26:23.186 "superblock": true, 00:26:23.186 "num_base_bdevs": 2, 00:26:23.186 "num_base_bdevs_discovered": 2, 00:26:23.186 "num_base_bdevs_operational": 2, 00:26:23.186 "base_bdevs_list": [ 00:26:23.186 { 00:26:23.186 "name": "pt1", 00:26:23.186 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:23.186 "is_configured": true, 00:26:23.186 "data_offset": 256, 00:26:23.186 "data_size": 7936 00:26:23.186 }, 00:26:23.186 { 00:26:23.186 "name": "pt2", 00:26:23.186 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:23.186 "is_configured": true, 00:26:23.186 "data_offset": 256, 00:26:23.186 "data_size": 7936 00:26:23.186 } 00:26:23.186 ] 00:26:23.186 } 00:26:23.186 } 00:26:23.186 }' 00:26:23.186 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:23.186 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:23.186 pt2' 00:26:23.186 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:23.186 16:02:43 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:23.186 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:23.446 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:23.446 "name": "pt1", 00:26:23.446 "aliases": [ 00:26:23.446 "00000000-0000-0000-0000-000000000001" 00:26:23.446 ], 00:26:23.446 "product_name": "passthru", 00:26:23.446 "block_size": 4096, 00:26:23.446 "num_blocks": 8192, 00:26:23.446 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:23.446 "assigned_rate_limits": { 00:26:23.446 "rw_ios_per_sec": 0, 00:26:23.446 "rw_mbytes_per_sec": 0, 00:26:23.446 "r_mbytes_per_sec": 0, 00:26:23.446 "w_mbytes_per_sec": 0 00:26:23.446 }, 00:26:23.446 "claimed": true, 00:26:23.446 "claim_type": "exclusive_write", 00:26:23.446 "zoned": false, 00:26:23.446 "supported_io_types": { 00:26:23.446 "read": true, 00:26:23.446 "write": true, 00:26:23.446 "unmap": true, 00:26:23.446 "flush": true, 00:26:23.446 "reset": true, 00:26:23.446 "nvme_admin": false, 00:26:23.446 "nvme_io": false, 00:26:23.446 "nvme_io_md": false, 00:26:23.446 "write_zeroes": true, 00:26:23.446 "zcopy": true, 00:26:23.446 "get_zone_info": false, 00:26:23.446 "zone_management": false, 00:26:23.446 "zone_append": false, 00:26:23.446 "compare": false, 00:26:23.446 "compare_and_write": false, 00:26:23.446 "abort": true, 00:26:23.446 "seek_hole": false, 00:26:23.446 "seek_data": false, 00:26:23.446 "copy": true, 00:26:23.446 "nvme_iov_md": false 00:26:23.446 }, 00:26:23.446 "memory_domains": [ 00:26:23.446 { 00:26:23.446 "dma_device_id": "system", 00:26:23.446 "dma_device_type": 1 00:26:23.446 }, 00:26:23.446 { 00:26:23.446 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:23.446 "dma_device_type": 2 00:26:23.446 } 00:26:23.446 ], 00:26:23.446 "driver_specific": { 00:26:23.446 "passthru": { 00:26:23.446 "name": "pt1", 00:26:23.446 "base_bdev_name": "malloc1" 00:26:23.446 } 00:26:23.446 } 00:26:23.446 }' 00:26:23.446 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.446 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.446 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:23.446 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:23.446 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:23.706 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:23.706 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:23.706 16:02:43 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:23.706 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:23.706 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:23.706 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:23.706 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:23.706 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:23.706 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:23.706 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:23.966 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:23.966 "name": "pt2", 00:26:23.966 "aliases": [ 00:26:23.966 "00000000-0000-0000-0000-000000000002" 00:26:23.966 ], 00:26:23.966 "product_name": "passthru", 00:26:23.966 "block_size": 4096, 00:26:23.966 "num_blocks": 8192, 00:26:23.966 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:23.966 "assigned_rate_limits": { 00:26:23.966 "rw_ios_per_sec": 0, 00:26:23.966 "rw_mbytes_per_sec": 0, 00:26:23.966 "r_mbytes_per_sec": 0, 00:26:23.966 "w_mbytes_per_sec": 0 00:26:23.966 }, 00:26:23.966 "claimed": true, 00:26:23.966 "claim_type": "exclusive_write", 00:26:23.966 "zoned": false, 00:26:23.966 "supported_io_types": { 00:26:23.966 "read": true, 00:26:23.966 "write": true, 00:26:23.966 "unmap": true, 00:26:23.966 "flush": true, 00:26:23.966 "reset": true, 00:26:23.966 "nvme_admin": false, 00:26:23.966 "nvme_io": false, 00:26:23.966 "nvme_io_md": false, 00:26:23.966 "write_zeroes": true, 00:26:23.967 "zcopy": true, 00:26:23.967 "get_zone_info": false, 00:26:23.967 "zone_management": false, 00:26:23.967 "zone_append": false, 00:26:23.967 "compare": false, 00:26:23.967 "compare_and_write": false, 00:26:23.967 "abort": true, 00:26:23.967 "seek_hole": false, 00:26:23.967 "seek_data": false, 00:26:23.967 "copy": true, 00:26:23.967 "nvme_iov_md": false 00:26:23.967 }, 00:26:23.967 "memory_domains": [ 00:26:23.967 { 00:26:23.967 "dma_device_id": "system", 00:26:23.967 "dma_device_type": 1 00:26:23.967 }, 00:26:23.967 { 00:26:23.967 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:23.967 "dma_device_type": 2 00:26:23.967 } 00:26:23.967 ], 00:26:23.967 "driver_specific": { 00:26:23.967 "passthru": { 00:26:23.967 "name": "pt2", 00:26:23.967 "base_bdev_name": "malloc2" 00:26:23.967 } 00:26:23.967 } 00:26:23.967 }' 00:26:23.967 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.967 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.967 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:23.967 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:24.226 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:24.226 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:26:24.227 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:24.227 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:24.227 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:26:24.227 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:24.227 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:24.227 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:26:24.227 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:26:24.227 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b raid_bdev1 00:26:24.487 [2024-07-12 16:02:44.820975] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:24.487 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 24043341-8370-456f-9cc1-0a354a830552 '!=' 24043341-8370-456f-9cc1-0a354a830552 ']' 00:26:24.487 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:26:24.487 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:24.487 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:26:24.487 16:02:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:24.747 [2024-07-12 16:02:45.013284] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.747 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.007 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:25.007 "name": "raid_bdev1", 00:26:25.007 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:25.007 "strip_size_kb": 0, 00:26:25.007 "state": "online", 00:26:25.007 "raid_level": "raid1", 00:26:25.007 "superblock": true, 00:26:25.007 "num_base_bdevs": 2, 00:26:25.007 "num_base_bdevs_discovered": 1, 00:26:25.007 "num_base_bdevs_operational": 1, 00:26:25.007 "base_bdevs_list": [ 00:26:25.007 { 00:26:25.007 "name": null, 00:26:25.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.007 "is_configured": false, 00:26:25.007 "data_offset": 256, 00:26:25.007 "data_size": 7936 00:26:25.007 }, 00:26:25.007 { 00:26:25.007 "name": "pt2", 00:26:25.007 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:25.007 "is_configured": true, 00:26:25.007 "data_offset": 256, 00:26:25.007 "data_size": 7936 00:26:25.007 } 00:26:25.007 ] 00:26:25.007 }' 00:26:25.007 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:25.007 16:02:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 
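Condensed, the degraded-array check traced above boils down to two RPCs (a sketch assembled from commands already present in the trace; the socket path and bdev names are copied from the log, and the values in the comments are the ones the log reports):

# removing one passthru base bdev must not take the raid1 volume offline
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1
# the array should stay online in degraded mode: 1 of 2 base bdevs discovered
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all \
  | jq -r '.[] | select(.name == "raid_bdev1") | "\(.state) \(.num_base_bdevs_discovered)/\(.num_base_bdevs)"'
# per the trace: online 1/2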
00:26:25.607 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:25.607 [2024-07-12 16:02:45.927571] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:25.607 [2024-07-12 16:02:45.927585] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:25.607 [2024-07-12 16:02:45.927621] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:25.607 [2024-07-12 16:02:45.927649] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:25.607 [2024-07-12 16:02:45.927655] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2719f20 name raid_bdev1, state offline 00:26:25.607 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.607 16:02:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:26:25.867 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:26:25.867 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:26:25.867 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:26:25.867 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:25.867 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:26.128 [2024-07-12 16:02:46.505009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:26.128 [2024-07-12 16:02:46.505035] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:26.128 [2024-07-12 16:02:46.505046] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25692a0 00:26:26.128 [2024-07-12 16:02:46.505058] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:26.128 [2024-07-12 16:02:46.506346] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:26.128 [2024-07-12 16:02:46.506365] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:26.128 [2024-07-12 16:02:46.506410] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:26.128 [2024-07-12 16:02:46.506428] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:26.128 [2024-07-12 16:02:46.506489] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2568b20 00:26:26.128 [2024-07-12 16:02:46.506495] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:26.128 [2024-07-12 16:02:46.506632] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2569530 00:26:26.128 [2024-07-12 16:02:46.506737] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2568b20 00:26:26.128 [2024-07-12 16:02:46.506743] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2568b20 00:26:26.128 [2024-07-12 16:02:46.506814] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:26.128 pt2 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.128 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.389 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.389 "name": "raid_bdev1", 00:26:26.389 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:26.389 "strip_size_kb": 0, 00:26:26.389 "state": "online", 00:26:26.389 "raid_level": "raid1", 00:26:26.389 "superblock": true, 00:26:26.389 "num_base_bdevs": 2, 00:26:26.389 "num_base_bdevs_discovered": 1, 00:26:26.389 "num_base_bdevs_operational": 1, 00:26:26.389 "base_bdevs_list": [ 00:26:26.389 { 00:26:26.389 "name": null, 00:26:26.389 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:26.389 "is_configured": false, 00:26:26.389 "data_offset": 256, 00:26:26.389 "data_size": 7936 00:26:26.389 }, 00:26:26.389 { 00:26:26.389 "name": "pt2", 00:26:26.389 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:26.389 "is_configured": true, 00:26:26.389 "data_offset": 256, 00:26:26.389 "data_size": 7936 00:26:26.389 } 00:26:26.389 ] 00:26:26.389 }' 00:26:26.389 16:02:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:26.389 16:02:46 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:26.958 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:26:27.220 [2024-07-12 16:02:47.427345] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:27.220 [2024-07-12 16:02:47.427359] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:27.220 [2024-07-12 16:02:47.427391] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:27.220 [2024-07-12 16:02:47.427419] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:27.220 [2024-07-12 16:02:47.427429] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2568b20 name raid_bdev1, state offline 00:26:27.220 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.220 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:26:27.220 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:26:27.220 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:26:27.220 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:26:27.220 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:27.481 [2024-07-12 16:02:47.796265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:27.481 [2024-07-12 16:02:47.796289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:27.481 [2024-07-12 16:02:47.796298] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2568da0 00:26:27.481 [2024-07-12 16:02:47.796304] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:27.481 [2024-07-12 16:02:47.797568] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:27.481 [2024-07-12 16:02:47.797586] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:27.481 [2024-07-12 16:02:47.797626] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:27.481 [2024-07-12 16:02:47.797643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:27.481 [2024-07-12 16:02:47.797723] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:27.481 [2024-07-12 16:02:47.797730] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:27.481 [2024-07-12 16:02:47.797738] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x271aa70 name raid_bdev1, state configuring 00:26:27.481 [2024-07-12 16:02:47.797751] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:27.481 [2024-07-12 16:02:47.797789] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2719670 00:26:27.481 [2024-07-12 16:02:47.797795] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:27.481 [2024-07-12 16:02:47.797928] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2571820 00:26:27.481 [2024-07-12 16:02:47.798023] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2719670 
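The superblock bookkeeping in the debug lines above is worth spelling out: after raid_bdev1 is deleted, re-creating pt2 lets the examine path find pt2's raid superblock and reassemble raid_bdev1 online in degraded state; re-creating pt1 then triggers the sequence-number comparison, and because pt2's superblock (seq_number 4) is newer than the assembly built from pt1's stale copy (seq_number 2), that assembly is torn down and raid_bdev1 is rebuilt around pt2, leaving pt1 unconfigured. A quick confirmation matching the test's own check (a sketch; the command and expected value are the ones in the trace):

/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online \
  | jq -r '.[].base_bdevs_list[0].is_configured'
# per the trace: false -- the stale pt1 slot stays unconfigured; only pt2 is assembled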
00:26:27.481 [2024-07-12 16:02:47.798028] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2719670 00:26:27.481 [2024-07-12 16:02:47.798098] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:27.481 pt1 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.481 16:02:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.742 16:02:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:27.742 "name": "raid_bdev1", 00:26:27.742 "uuid": "24043341-8370-456f-9cc1-0a354a830552", 00:26:27.742 "strip_size_kb": 0, 00:26:27.742 "state": "online", 00:26:27.742 "raid_level": "raid1", 00:26:27.742 "superblock": true, 00:26:27.742 "num_base_bdevs": 2, 00:26:27.742 "num_base_bdevs_discovered": 1, 00:26:27.742 "num_base_bdevs_operational": 1, 00:26:27.742 "base_bdevs_list": [ 00:26:27.742 { 00:26:27.742 "name": null, 00:26:27.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.742 "is_configured": false, 00:26:27.742 "data_offset": 256, 00:26:27.742 "data_size": 7936 00:26:27.742 }, 00:26:27.742 { 00:26:27.742 "name": "pt2", 00:26:27.742 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:27.742 "is_configured": true, 00:26:27.742 "data_offset": 256, 00:26:27.742 "data_size": 7936 00:26:27.742 } 00:26:27.742 ] 00:26:27.742 }' 00:26:27.742 16:02:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:27.742 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:28.313 16:02:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:28.313 16:02:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:28.313 16:02:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:26:28.313 16:02:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:28.313 16:02:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:26:28.572 [2024-07-12 16:02:48.867150] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 24043341-8370-456f-9cc1-0a354a830552 '!=' 24043341-8370-456f-9cc1-0a354a830552 ']' 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 2660285 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 2660285 ']' 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 2660285 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2660285 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2660285' 00:26:28.572 killing process with pid 2660285 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 2660285 00:26:28.572 [2024-07-12 16:02:48.934147] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:28.572 [2024-07-12 16:02:48.934181] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:28.572 [2024-07-12 16:02:48.934210] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:28.572 [2024-07-12 16:02:48.934215] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2719670 name raid_bdev1, state offline 00:26:28.572 16:02:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 2660285 00:26:28.572 [2024-07-12 16:02:48.943588] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:28.832 16:02:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:26:28.832 00:26:28.832 real 0m13.352s 00:26:28.832 user 0m24.760s 00:26:28.832 sys 0m2.011s 00:26:28.832 16:02:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:28.832 16:02:49 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:26:28.832 ************************************ 00:26:28.832 END TEST raid_superblock_test_4k 00:26:28.832 ************************************ 00:26:28.832 16:02:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:28.832 16:02:49 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:26:28.832 16:02:49 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:26:28.832 16:02:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:28.832 16:02:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:28.832 16:02:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:28.832 
************************************ 00:26:28.832 START TEST raid_rebuild_test_sb_4k 00:26:28.832 ************************************ 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=2662789 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 2662789 /var/tmp/spdk-raid.sock 00:26:28.832 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 2662789 ']' 00:26:28.833 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:28.833 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 
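For context on the workload this rebuild test drives: the bdevperf command line recorded above starts the app in wait mode (-z) so the script can control it over the RPC socket given by -r, with a 60-second time limit (-t 60), a random mixed workload (-w randrw) at a 50/50 read/write split (-M 50), 3 MiB I/Os (-o 3M, which is what the zero-copy threshold note below refers to) at a queue depth of 2 (-q 2), and the bdev_raid debug log component enabled (-L bdev_raid); -T raid_bdev1 and -U are reproduced here exactly as they appear in the trace, without further interpretation:

/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
  -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid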
00:26:28.833 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:28.833 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:28.833 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:28.833 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:28.833 16:02:49 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:28.833 [2024-07-12 16:02:49.203674] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:26:28.833 [2024-07-12 16:02:49.203723] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2662789 ] 00:26:28.833 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:28.833 Zero copy mechanism will not be used. 00:26:29.092 [2024-07-12 16:02:49.289939] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:29.092 [2024-07-12 16:02:49.354430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:29.092 [2024-07-12 16:02:49.398795] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:29.092 [2024-07-12 16:02:49.398821] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:29.662 16:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:29.662 16:02:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:26:29.662 16:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:29.662 16:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:26:30.288 BaseBdev1_malloc 00:26:30.288 16:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:30.547 [2024-07-12 16:02:50.757922] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:30.547 [2024-07-12 16:02:50.757960] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:30.547 [2024-07-12 16:02:50.757975] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1671010 00:26:30.547 [2024-07-12 16:02:50.757982] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:30.547 [2024-07-12 16:02:50.759258] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:30.547 [2024-07-12 16:02:50.759277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:30.547 BaseBdev1 00:26:30.547 16:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:30.547 16:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:26:30.547 BaseBdev2_malloc 
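Each base device for the rebuild test is constructed the same way as shown above: a 32 MiB malloc bdev with a 4096-byte block size (bdev_malloc_create 32 4096, i.e. 8192 blocks) wrapped in a passthru bdev that the raid module can later claim. Condensed to the two RPCs per device (a sketch; the paths and names are the ones in the trace):

# 32 MiB malloc backing device with 4096-byte blocks, then its passthru wrapper
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
# BaseBdev2 follows the same pattern with BaseBdev2_malloc / BaseBdev2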
00:26:30.547 16:02:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:30.806 [2024-07-12 16:02:51.132701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:30.806 [2024-07-12 16:02:51.132732] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:30.806 [2024-07-12 16:02:51.132745] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1671c30 00:26:30.806 [2024-07-12 16:02:51.132751] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:30.806 [2024-07-12 16:02:51.133892] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:30.806 [2024-07-12 16:02:51.133910] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:30.806 BaseBdev2 00:26:30.806 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:26:31.065 spare_malloc 00:26:31.065 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:31.065 spare_delay 00:26:31.325 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:31.325 [2024-07-12 16:02:51.687685] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:31.325 [2024-07-12 16:02:51.687707] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:31.325 [2024-07-12 16:02:51.687721] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18227f0 00:26:31.325 [2024-07-12 16:02:51.687727] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:31.325 [2024-07-12 16:02:51.688864] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:31.325 [2024-07-12 16:02:51.688882] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:31.325 spare 00:26:31.325 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:31.585 [2024-07-12 16:02:51.880189] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:31.585 [2024-07-12 16:02:51.881145] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:31.585 [2024-07-12 16:02:51.881259] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1669d30 00:26:31.585 [2024-07-12 16:02:51.881267] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:31.585 [2024-07-12 16:02:51.881403] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16699c0 00:26:31.585 [2024-07-12 16:02:51.881510] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1669d30 00:26:31.585 [2024-07-12 16:02:51.881515] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev is created with name raid_bdev1, raid_bdev 0x1669d30 00:26:31.585 [2024-07-12 16:02:51.881582] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.585 16:02:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.846 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:31.846 "name": "raid_bdev1", 00:26:31.846 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:31.846 "strip_size_kb": 0, 00:26:31.846 "state": "online", 00:26:31.846 "raid_level": "raid1", 00:26:31.846 "superblock": true, 00:26:31.846 "num_base_bdevs": 2, 00:26:31.846 "num_base_bdevs_discovered": 2, 00:26:31.846 "num_base_bdevs_operational": 2, 00:26:31.846 "base_bdevs_list": [ 00:26:31.846 { 00:26:31.846 "name": "BaseBdev1", 00:26:31.846 "uuid": "40d7b4eb-4b34-57e6-b242-e1971cb6cec9", 00:26:31.846 "is_configured": true, 00:26:31.846 "data_offset": 256, 00:26:31.846 "data_size": 7936 00:26:31.846 }, 00:26:31.846 { 00:26:31.846 "name": "BaseBdev2", 00:26:31.846 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:31.846 "is_configured": true, 00:26:31.846 "data_offset": 256, 00:26:31.846 "data_size": 7936 00:26:31.846 } 00:26:31.846 ] 00:26:31.846 }' 00:26:31.846 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:31.846 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:32.416 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:32.416 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:32.416 [2024-07-12 16:02:52.782657] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:32.416 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:26:32.416 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:26:32.416 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:32.674 16:02:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:32.674 [2024-07-12 16:02:53.099284] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16699c0 00:26:32.674 /dev/nbd0 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:32.934 1+0 records in 00:26:32.934 1+0 records out 00:26:32.934 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000168508 s, 24.3 MB/s 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # size=4096 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:32.934 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:26:33.504 7936+0 records in 00:26:33.504 7936+0 records out 00:26:33.504 32505856 bytes (33 MB, 31 MiB) copied, 0.617563 s, 52.6 MB/s 00:26:33.504 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:33.504 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:33.504 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:33.504 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:33.504 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:33.504 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:33.504 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:33.764 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:33.764 [2024-07-12 16:02:53.976405] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:33.764 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:33.764 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:33.764 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:33.764 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:33.764 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:33.764 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:33.764 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:33.764 16:02:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:33.764 [2024-07-12 16:02:54.159421] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:33.764 16:02:54 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.764 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.025 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:34.025 "name": "raid_bdev1", 00:26:34.025 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:34.025 "strip_size_kb": 0, 00:26:34.025 "state": "online", 00:26:34.025 "raid_level": "raid1", 00:26:34.025 "superblock": true, 00:26:34.025 "num_base_bdevs": 2, 00:26:34.025 "num_base_bdevs_discovered": 1, 00:26:34.025 "num_base_bdevs_operational": 1, 00:26:34.025 "base_bdevs_list": [ 00:26:34.025 { 00:26:34.025 "name": null, 00:26:34.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.025 "is_configured": false, 00:26:34.025 "data_offset": 256, 00:26:34.025 "data_size": 7936 00:26:34.025 }, 00:26:34.025 { 00:26:34.025 "name": "BaseBdev2", 00:26:34.025 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:34.025 "is_configured": true, 00:26:34.025 "data_offset": 256, 00:26:34.025 "data_size": 7936 00:26:34.025 } 00:26:34.025 ] 00:26:34.025 }' 00:26:34.025 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:34.025 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:34.595 16:02:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:34.854 [2024-07-12 16:02:55.069732] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:34.854 [2024-07-12 16:02:55.073007] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16687e0 00:26:34.854 [2024-07-12 16:02:55.074545] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:34.854 16:02:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:35.795 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:35.795 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:35.795 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:35.795 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:35.795 16:02:56 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:35.795 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:35.795 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.055 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:36.055 "name": "raid_bdev1", 00:26:36.055 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:36.055 "strip_size_kb": 0, 00:26:36.055 "state": "online", 00:26:36.055 "raid_level": "raid1", 00:26:36.055 "superblock": true, 00:26:36.055 "num_base_bdevs": 2, 00:26:36.055 "num_base_bdevs_discovered": 2, 00:26:36.055 "num_base_bdevs_operational": 2, 00:26:36.055 "process": { 00:26:36.055 "type": "rebuild", 00:26:36.055 "target": "spare", 00:26:36.055 "progress": { 00:26:36.055 "blocks": 3072, 00:26:36.055 "percent": 38 00:26:36.055 } 00:26:36.055 }, 00:26:36.055 "base_bdevs_list": [ 00:26:36.055 { 00:26:36.055 "name": "spare", 00:26:36.055 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:36.055 "is_configured": true, 00:26:36.055 "data_offset": 256, 00:26:36.055 "data_size": 7936 00:26:36.055 }, 00:26:36.055 { 00:26:36.055 "name": "BaseBdev2", 00:26:36.055 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:36.055 "is_configured": true, 00:26:36.055 "data_offset": 256, 00:26:36.055 "data_size": 7936 00:26:36.055 } 00:26:36.055 ] 00:26:36.055 }' 00:26:36.055 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:36.055 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:36.055 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:36.055 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:36.055 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:36.315 [2024-07-12 16:02:56.575122] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:36.315 [2024-07-12 16:02:56.583419] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:36.315 [2024-07-12 16:02:56.583452] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:36.315 [2024-07-12 16:02:56.583462] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:36.315 [2024-07-12 16:02:56.583466] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.315 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.574 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:36.574 "name": "raid_bdev1", 00:26:36.574 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:36.574 "strip_size_kb": 0, 00:26:36.574 "state": "online", 00:26:36.574 "raid_level": "raid1", 00:26:36.574 "superblock": true, 00:26:36.574 "num_base_bdevs": 2, 00:26:36.574 "num_base_bdevs_discovered": 1, 00:26:36.574 "num_base_bdevs_operational": 1, 00:26:36.574 "base_bdevs_list": [ 00:26:36.574 { 00:26:36.574 "name": null, 00:26:36.574 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:36.574 "is_configured": false, 00:26:36.574 "data_offset": 256, 00:26:36.574 "data_size": 7936 00:26:36.574 }, 00:26:36.574 { 00:26:36.574 "name": "BaseBdev2", 00:26:36.574 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:36.574 "is_configured": true, 00:26:36.574 "data_offset": 256, 00:26:36.574 "data_size": 7936 00:26:36.574 } 00:26:36.574 ] 00:26:36.574 }' 00:26:36.574 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:36.574 16:02:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:37.144 "name": "raid_bdev1", 00:26:37.144 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:37.144 "strip_size_kb": 0, 00:26:37.144 "state": "online", 00:26:37.144 "raid_level": "raid1", 00:26:37.144 "superblock": true, 00:26:37.144 "num_base_bdevs": 2, 00:26:37.144 "num_base_bdevs_discovered": 1, 00:26:37.144 "num_base_bdevs_operational": 1, 00:26:37.144 "base_bdevs_list": [ 00:26:37.144 { 00:26:37.144 "name": null, 00:26:37.144 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:37.144 "is_configured": false, 00:26:37.144 "data_offset": 
256, 00:26:37.144 "data_size": 7936 00:26:37.144 }, 00:26:37.144 { 00:26:37.144 "name": "BaseBdev2", 00:26:37.144 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:37.144 "is_configured": true, 00:26:37.144 "data_offset": 256, 00:26:37.144 "data_size": 7936 00:26:37.144 } 00:26:37.144 ] 00:26:37.144 }' 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:37.144 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:37.404 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:37.404 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:37.404 [2024-07-12 16:02:57.770261] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:37.404 [2024-07-12 16:02:57.773623] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1669720 00:26:37.404 [2024-07-12 16:02:57.774764] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:37.404 16:02:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:38.344 16:02:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:38.344 16:02:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:38.344 16:02:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:38.344 16:02:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:38.344 16:02:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:38.603 16:02:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.603 16:02:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.603 16:02:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:38.603 "name": "raid_bdev1", 00:26:38.603 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:38.603 "strip_size_kb": 0, 00:26:38.603 "state": "online", 00:26:38.603 "raid_level": "raid1", 00:26:38.603 "superblock": true, 00:26:38.603 "num_base_bdevs": 2, 00:26:38.603 "num_base_bdevs_discovered": 2, 00:26:38.603 "num_base_bdevs_operational": 2, 00:26:38.603 "process": { 00:26:38.603 "type": "rebuild", 00:26:38.603 "target": "spare", 00:26:38.603 "progress": { 00:26:38.603 "blocks": 2816, 00:26:38.603 "percent": 35 00:26:38.603 } 00:26:38.603 }, 00:26:38.603 "base_bdevs_list": [ 00:26:38.603 { 00:26:38.603 "name": "spare", 00:26:38.603 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:38.603 "is_configured": true, 00:26:38.603 "data_offset": 256, 00:26:38.603 "data_size": 7936 00:26:38.603 }, 00:26:38.603 { 00:26:38.603 "name": "BaseBdev2", 00:26:38.603 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:38.603 "is_configured": true, 00:26:38.603 "data_offset": 256, 00:26:38.603 "data_size": 7936 00:26:38.603 } 00:26:38.603 ] 00:26:38.603 }' 00:26:38.603 
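The dump above is what the harness keys on while a rebuild is running: a "process" object with type "rebuild", the target base bdev, and a blocks/percent progress pair. As a minimal sketch (not part of the test, but built only from the rpc.py socket, bdev name, and fields shown in this run), the same progress could be polled like this:

    # Hedged sketch: poll rebuild progress with the same RPC and jq filter used above.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py   # path as seen in this run
    sock=/var/tmp/spdk-raid.sock
    while :; do
        # A missing "process" object in the dump means no rebuild is in flight.
        pct=$("$rpc" -s "$sock" bdev_raid_get_bdevs all \
              | jq -r '.[] | select(.name == "raid_bdev1") | .process.progress.percent // "none"')
        [ "$pct" = "none" ] && break
        echo "rebuild at ${pct}%"
        sleep 1
    done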
16:02:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:38.603 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:38.603 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:38.864 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=925 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:38.864 "name": "raid_bdev1", 00:26:38.864 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:38.864 "strip_size_kb": 0, 00:26:38.864 "state": "online", 00:26:38.864 "raid_level": "raid1", 00:26:38.864 "superblock": true, 00:26:38.864 "num_base_bdevs": 2, 00:26:38.864 "num_base_bdevs_discovered": 2, 00:26:38.864 "num_base_bdevs_operational": 2, 00:26:38.864 "process": { 00:26:38.864 "type": "rebuild", 00:26:38.864 "target": "spare", 00:26:38.864 "progress": { 00:26:38.864 "blocks": 3584, 00:26:38.864 "percent": 45 00:26:38.864 } 00:26:38.864 }, 00:26:38.864 "base_bdevs_list": [ 00:26:38.864 { 00:26:38.864 "name": "spare", 00:26:38.864 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:38.864 "is_configured": true, 00:26:38.864 "data_offset": 256, 00:26:38.864 "data_size": 7936 00:26:38.864 }, 00:26:38.864 { 00:26:38.864 "name": "BaseBdev2", 00:26:38.864 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:38.864 "is_configured": true, 00:26:38.864 "data_offset": 256, 00:26:38.864 "data_size": 7936 00:26:38.864 } 00:26:38.864 ] 00:26:38.864 }' 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:38.864 
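The "[: =: unary operator expected" message captured above comes from the test at bdev_raid.sh line 665, where the left-hand operand of a '[' test expanded to an empty string, so the shell only saw '[' = false ']'. The run continues because the failed test simply falls through, but the usual hardening is to quote the operand or use '[[ ]]'. A hedged illustration (the variable name is invented for this sketch, not taken from the script):

    # maybe_empty stands in for whatever variable was unset at line 665.
    maybe_empty=""
    [ $maybe_empty = false ] 2>/dev/null || echo "unquoted empty operand -> 'unary operator expected'"
    [ "$maybe_empty" = false ]           || echo "quoted operand: the test is well-formed and simply false"
    [[ $maybe_empty = false ]]           || echo "[[ ]] also tolerates the empty expansion"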
16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:38.864 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:39.125 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:39.125 16:02:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:40.064 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:40.064 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:40.064 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:40.064 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:40.064 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:40.064 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:40.064 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.064 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.324 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:40.324 "name": "raid_bdev1", 00:26:40.324 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:40.324 "strip_size_kb": 0, 00:26:40.324 "state": "online", 00:26:40.324 "raid_level": "raid1", 00:26:40.324 "superblock": true, 00:26:40.324 "num_base_bdevs": 2, 00:26:40.324 "num_base_bdevs_discovered": 2, 00:26:40.324 "num_base_bdevs_operational": 2, 00:26:40.324 "process": { 00:26:40.324 "type": "rebuild", 00:26:40.324 "target": "spare", 00:26:40.324 "progress": { 00:26:40.324 "blocks": 6912, 00:26:40.324 "percent": 87 00:26:40.324 } 00:26:40.324 }, 00:26:40.324 "base_bdevs_list": [ 00:26:40.324 { 00:26:40.324 "name": "spare", 00:26:40.324 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:40.324 "is_configured": true, 00:26:40.324 "data_offset": 256, 00:26:40.324 "data_size": 7936 00:26:40.324 }, 00:26:40.324 { 00:26:40.324 "name": "BaseBdev2", 00:26:40.324 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:40.324 "is_configured": true, 00:26:40.324 "data_offset": 256, 00:26:40.324 "data_size": 7936 00:26:40.324 } 00:26:40.324 ] 00:26:40.324 }' 00:26:40.324 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:40.324 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:40.324 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:40.324 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:40.324 16:03:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:40.584 [2024-07-12 16:03:00.892878] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:40.584 [2024-07-12 16:03:00.892922] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:40.584 [2024-07-12 16:03:00.892990] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: 
raid_bdev_destroy_cb 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:41.523 "name": "raid_bdev1", 00:26:41.523 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:41.523 "strip_size_kb": 0, 00:26:41.523 "state": "online", 00:26:41.523 "raid_level": "raid1", 00:26:41.523 "superblock": true, 00:26:41.523 "num_base_bdevs": 2, 00:26:41.523 "num_base_bdevs_discovered": 2, 00:26:41.523 "num_base_bdevs_operational": 2, 00:26:41.523 "base_bdevs_list": [ 00:26:41.523 { 00:26:41.523 "name": "spare", 00:26:41.523 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:41.523 "is_configured": true, 00:26:41.523 "data_offset": 256, 00:26:41.523 "data_size": 7936 00:26:41.523 }, 00:26:41.523 { 00:26:41.523 "name": "BaseBdev2", 00:26:41.523 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:41.523 "is_configured": true, 00:26:41.523 "data_offset": 256, 00:26:41.523 "data_size": 7936 00:26:41.523 } 00:26:41.523 ] 00:26:41.523 }' 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.523 16:03:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:26:41.782 "name": "raid_bdev1", 00:26:41.782 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:41.782 "strip_size_kb": 0, 00:26:41.782 "state": "online", 00:26:41.782 "raid_level": "raid1", 00:26:41.782 "superblock": true, 00:26:41.782 "num_base_bdevs": 2, 00:26:41.782 "num_base_bdevs_discovered": 2, 00:26:41.782 "num_base_bdevs_operational": 2, 00:26:41.782 "base_bdevs_list": [ 00:26:41.782 { 00:26:41.782 "name": "spare", 00:26:41.782 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:41.782 "is_configured": true, 00:26:41.782 "data_offset": 256, 00:26:41.782 "data_size": 7936 00:26:41.782 }, 00:26:41.782 { 00:26:41.782 "name": "BaseBdev2", 00:26:41.782 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:41.782 "is_configured": true, 00:26:41.782 "data_offset": 256, 00:26:41.782 "data_size": 7936 00:26:41.782 } 00:26:41.782 ] 00:26:41.782 }' 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.782 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.042 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.042 "name": "raid_bdev1", 00:26:42.042 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:42.042 "strip_size_kb": 0, 00:26:42.042 "state": "online", 00:26:42.042 "raid_level": "raid1", 00:26:42.042 "superblock": true, 00:26:42.042 "num_base_bdevs": 2, 00:26:42.042 "num_base_bdevs_discovered": 2, 00:26:42.042 "num_base_bdevs_operational": 2, 00:26:42.042 "base_bdevs_list": [ 00:26:42.042 { 00:26:42.042 "name": "spare", 00:26:42.042 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:42.042 "is_configured": true, 00:26:42.042 "data_offset": 256, 00:26:42.042 "data_size": 7936 00:26:42.042 }, 00:26:42.042 { 00:26:42.042 "name": 
"BaseBdev2", 00:26:42.042 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:42.042 "is_configured": true, 00:26:42.042 "data_offset": 256, 00:26:42.042 "data_size": 7936 00:26:42.042 } 00:26:42.042 ] 00:26:42.042 }' 00:26:42.042 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.042 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:42.611 16:03:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:42.908 [2024-07-12 16:03:03.121486] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:42.908 [2024-07-12 16:03:03.121506] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:42.908 [2024-07-12 16:03:03.121551] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:42.908 [2024-07-12 16:03:03.121591] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:42.908 [2024-07-12 16:03:03.121597] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1669d30 name raid_bdev1, state offline 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:42.908 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:43.168 /dev/nbd0 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@867 -- # local i 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:43.168 1+0 records in 00:26:43.168 1+0 records out 00:26:43.168 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306227 s, 13.4 MB/s 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:43.168 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:43.427 /dev/nbd1 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:43.427 1+0 records in 00:26:43.427 1+0 records out 00:26:43.427 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304162 s, 13.5 MB/s 00:26:43.427 
16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:43.427 16:03:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:43.685 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:43.945 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:43.945 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:43.945 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:43.945 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:43.945 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:43.945 16:03:04 
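The sequence above exports BaseBdev1 and the rebuilt spare through NBD and byte-compares them with cmp -i 1048576, i.e. skipping the superblock region implied by the 256-block data_offset at 4096-byte blocks. Condensed into a hedged sketch using only the RPCs and names that appear in this run:

    # Hedged sketch: export two SPDK bdevs over NBD and compare their data areas.
    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    sock=/var/tmp/spdk-raid.sock
    "$rpc" -s "$sock" nbd_start_disk BaseBdev1 /dev/nbd0
    "$rpc" -s "$sock" nbd_start_disk spare /dev/nbd1
    cmp -i 1048576 /dev/nbd0 /dev/nbd1 && echo "rebuilt data matches the original"   # skip 1 MiB of metadata
    "$rpc" -s "$sock" nbd_stop_disk /dev/nbd0
    "$rpc" -s "$sock" nbd_stop_disk /dev/nbd1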
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:43.945 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:43.945 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:43.945 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:43.945 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:44.204 [2024-07-12 16:03:04.621031] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:44.204 [2024-07-12 16:03:04.621065] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:44.204 [2024-07-12 16:03:04.621077] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1667d30 00:26:44.204 [2024-07-12 16:03:04.621084] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:44.204 [2024-07-12 16:03:04.622405] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:44.204 [2024-07-12 16:03:04.622428] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:44.204 [2024-07-12 16:03:04.622488] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:44.204 [2024-07-12 16:03:04.622508] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:44.204 [2024-07-12 16:03:04.622589] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:44.204 spare 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.204 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.464 [2024-07-12 16:03:04.722879] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1819e10 00:26:44.464 [2024-07-12 
16:03:04.722888] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:44.464 [2024-07-12 16:03:04.723050] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x181a2c0 00:26:44.464 [2024-07-12 16:03:04.723172] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1819e10 00:26:44.464 [2024-07-12 16:03:04.723178] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1819e10 00:26:44.464 [2024-07-12 16:03:04.723265] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:44.464 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:44.464 "name": "raid_bdev1", 00:26:44.464 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:44.464 "strip_size_kb": 0, 00:26:44.464 "state": "online", 00:26:44.464 "raid_level": "raid1", 00:26:44.464 "superblock": true, 00:26:44.464 "num_base_bdevs": 2, 00:26:44.464 "num_base_bdevs_discovered": 2, 00:26:44.464 "num_base_bdevs_operational": 2, 00:26:44.464 "base_bdevs_list": [ 00:26:44.464 { 00:26:44.464 "name": "spare", 00:26:44.464 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:44.464 "is_configured": true, 00:26:44.464 "data_offset": 256, 00:26:44.464 "data_size": 7936 00:26:44.464 }, 00:26:44.464 { 00:26:44.464 "name": "BaseBdev2", 00:26:44.464 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:44.464 "is_configured": true, 00:26:44.464 "data_offset": 256, 00:26:44.464 "data_size": 7936 00:26:44.464 } 00:26:44.464 ] 00:26:44.464 }' 00:26:44.464 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:44.464 16:03:04 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:45.033 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:45.033 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:45.033 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:45.033 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:45.033 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:45.033 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.033 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.293 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:45.293 "name": "raid_bdev1", 00:26:45.293 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:45.293 "strip_size_kb": 0, 00:26:45.293 "state": "online", 00:26:45.293 "raid_level": "raid1", 00:26:45.293 "superblock": true, 00:26:45.293 "num_base_bdevs": 2, 00:26:45.293 "num_base_bdevs_discovered": 2, 00:26:45.293 "num_base_bdevs_operational": 2, 00:26:45.293 "base_bdevs_list": [ 00:26:45.293 { 00:26:45.293 "name": "spare", 00:26:45.293 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:45.293 "is_configured": true, 00:26:45.293 "data_offset": 256, 00:26:45.293 "data_size": 7936 00:26:45.293 }, 00:26:45.293 { 00:26:45.293 "name": "BaseBdev2", 00:26:45.293 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:45.293 "is_configured": true, 00:26:45.293 
"data_offset": 256, 00:26:45.293 "data_size": 7936 00:26:45.293 } 00:26:45.293 ] 00:26:45.293 }' 00:26:45.293 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:45.293 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:45.293 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:45.293 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:45.293 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.293 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:45.553 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:45.553 16:03:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:45.813 [2024-07-12 16:03:06.008634] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:45.813 "name": "raid_bdev1", 00:26:45.813 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:45.813 "strip_size_kb": 0, 00:26:45.813 "state": "online", 00:26:45.813 "raid_level": "raid1", 00:26:45.813 "superblock": true, 00:26:45.813 "num_base_bdevs": 2, 00:26:45.813 "num_base_bdevs_discovered": 1, 00:26:45.813 "num_base_bdevs_operational": 1, 00:26:45.813 "base_bdevs_list": [ 00:26:45.813 { 00:26:45.813 "name": null, 00:26:45.813 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:45.813 "is_configured": false, 00:26:45.813 "data_offset": 256, 00:26:45.813 "data_size": 7936 00:26:45.813 }, 00:26:45.813 { 00:26:45.813 "name": "BaseBdev2", 00:26:45.813 "uuid": 
"49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:45.813 "is_configured": true, 00:26:45.813 "data_offset": 256, 00:26:45.813 "data_size": 7936 00:26:45.813 } 00:26:45.813 ] 00:26:45.813 }' 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:45.813 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:46.383 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:46.642 [2024-07-12 16:03:06.910923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:46.642 [2024-07-12 16:03:06.911037] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:46.642 [2024-07-12 16:03:06.911047] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:46.642 [2024-07-12 16:03:06.911065] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:46.642 [2024-07-12 16:03:06.914423] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x166a010 00:26:46.642 [2024-07-12 16:03:06.915489] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:46.642 16:03:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:47.581 16:03:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:47.581 16:03:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:47.581 16:03:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:47.581 16:03:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:47.581 16:03:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:47.581 16:03:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.581 16:03:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:47.841 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:47.841 "name": "raid_bdev1", 00:26:47.841 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:47.841 "strip_size_kb": 0, 00:26:47.841 "state": "online", 00:26:47.841 "raid_level": "raid1", 00:26:47.841 "superblock": true, 00:26:47.841 "num_base_bdevs": 2, 00:26:47.841 "num_base_bdevs_discovered": 2, 00:26:47.841 "num_base_bdevs_operational": 2, 00:26:47.841 "process": { 00:26:47.841 "type": "rebuild", 00:26:47.841 "target": "spare", 00:26:47.841 "progress": { 00:26:47.841 "blocks": 2816, 00:26:47.841 "percent": 35 00:26:47.841 } 00:26:47.841 }, 00:26:47.841 "base_bdevs_list": [ 00:26:47.841 { 00:26:47.841 "name": "spare", 00:26:47.841 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:47.841 "is_configured": true, 00:26:47.841 "data_offset": 256, 00:26:47.841 "data_size": 7936 00:26:47.841 }, 00:26:47.841 { 00:26:47.841 "name": "BaseBdev2", 00:26:47.841 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:47.841 "is_configured": true, 00:26:47.841 "data_offset": 256, 00:26:47.841 "data_size": 
7936 00:26:47.841 } 00:26:47.841 ] 00:26:47.841 }' 00:26:47.841 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:47.841 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:47.841 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:47.841 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:47.841 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:48.102 [2024-07-12 16:03:08.392134] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:48.102 [2024-07-12 16:03:08.424463] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:48.102 [2024-07-12 16:03:08.424496] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:48.102 [2024-07-12 16:03:08.424505] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:48.102 [2024-07-12 16:03:08.424510] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:48.102 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.361 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:48.361 "name": "raid_bdev1", 00:26:48.361 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:48.361 "strip_size_kb": 0, 00:26:48.361 "state": "online", 00:26:48.361 "raid_level": "raid1", 00:26:48.361 "superblock": true, 00:26:48.361 "num_base_bdevs": 2, 00:26:48.361 "num_base_bdevs_discovered": 1, 00:26:48.361 "num_base_bdevs_operational": 1, 00:26:48.361 "base_bdevs_list": [ 00:26:48.361 { 00:26:48.361 "name": null, 00:26:48.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:48.361 "is_configured": false, 00:26:48.361 "data_offset": 256, 00:26:48.361 "data_size": 7936 00:26:48.361 }, 00:26:48.361 { 00:26:48.361 
"name": "BaseBdev2", 00:26:48.361 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:48.361 "is_configured": true, 00:26:48.361 "data_offset": 256, 00:26:48.361 "data_size": 7936 00:26:48.361 } 00:26:48.361 ] 00:26:48.361 }' 00:26:48.361 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:48.361 16:03:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:48.931 16:03:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:48.931 [2024-07-12 16:03:09.326640] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:48.931 [2024-07-12 16:03:09.326676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:48.931 [2024-07-12 16:03:09.326689] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16693c0 00:26:48.931 [2024-07-12 16:03:09.326696] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:48.931 [2024-07-12 16:03:09.327015] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:48.931 [2024-07-12 16:03:09.327027] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:48.931 [2024-07-12 16:03:09.327086] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:48.931 [2024-07-12 16:03:09.327094] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:48.931 [2024-07-12 16:03:09.327105] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:48.931 [2024-07-12 16:03:09.327117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:48.931 [2024-07-12 16:03:09.330438] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1669650 00:26:48.931 [2024-07-12 16:03:09.331498] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:48.931 spare 00:26:48.931 16:03:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:50.313 "name": "raid_bdev1", 00:26:50.313 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:50.313 "strip_size_kb": 0, 00:26:50.313 "state": "online", 00:26:50.313 "raid_level": "raid1", 00:26:50.313 "superblock": true, 00:26:50.313 "num_base_bdevs": 2, 00:26:50.313 "num_base_bdevs_discovered": 2, 00:26:50.313 "num_base_bdevs_operational": 2, 00:26:50.313 "process": { 00:26:50.313 "type": "rebuild", 00:26:50.313 "target": "spare", 00:26:50.313 "progress": { 00:26:50.313 "blocks": 2816, 00:26:50.313 "percent": 35 00:26:50.313 } 00:26:50.313 }, 00:26:50.313 "base_bdevs_list": [ 00:26:50.313 { 00:26:50.313 "name": "spare", 00:26:50.313 "uuid": "804190ac-687c-5a80-9132-a62ada9deb56", 00:26:50.313 "is_configured": true, 00:26:50.313 "data_offset": 256, 00:26:50.313 "data_size": 7936 00:26:50.313 }, 00:26:50.313 { 00:26:50.313 "name": "BaseBdev2", 00:26:50.313 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:50.313 "is_configured": true, 00:26:50.313 "data_offset": 256, 00:26:50.313 "data_size": 7936 00:26:50.313 } 00:26:50.313 ] 00:26:50.313 }' 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:50.313 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:50.573 [2024-07-12 16:03:10.788274] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:50.573 [2024-07-12 16:03:10.840370] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:50.573 [2024-07-12 16:03:10.840403] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:50.573 [2024-07-12 16:03:10.840413] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:50.573 [2024-07-12 16:03:10.840417] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.573 16:03:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.833 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.833 "name": "raid_bdev1", 00:26:50.833 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:50.833 "strip_size_kb": 0, 00:26:50.833 "state": "online", 00:26:50.833 "raid_level": "raid1", 00:26:50.833 "superblock": true, 00:26:50.833 "num_base_bdevs": 2, 00:26:50.833 "num_base_bdevs_discovered": 1, 00:26:50.833 "num_base_bdevs_operational": 1, 00:26:50.833 "base_bdevs_list": [ 00:26:50.833 { 00:26:50.833 "name": null, 00:26:50.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:50.833 "is_configured": false, 00:26:50.833 "data_offset": 256, 00:26:50.833 "data_size": 7936 00:26:50.833 }, 00:26:50.833 { 00:26:50.833 "name": "BaseBdev2", 00:26:50.833 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:50.833 "is_configured": true, 00:26:50.833 "data_offset": 256, 00:26:50.833 "data_size": 7936 00:26:50.833 } 00:26:50.833 ] 00:26:50.833 }' 00:26:50.833 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.833 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:51.403 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:51.403 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:51.403 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:51.403 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:51.403 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:51.403 16:03:11 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.403 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.403 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:51.403 "name": "raid_bdev1", 00:26:51.403 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:51.403 "strip_size_kb": 0, 00:26:51.403 "state": "online", 00:26:51.403 "raid_level": "raid1", 00:26:51.403 "superblock": true, 00:26:51.403 "num_base_bdevs": 2, 00:26:51.403 "num_base_bdevs_discovered": 1, 00:26:51.403 "num_base_bdevs_operational": 1, 00:26:51.403 "base_bdevs_list": [ 00:26:51.403 { 00:26:51.403 "name": null, 00:26:51.403 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:51.403 "is_configured": false, 00:26:51.403 "data_offset": 256, 00:26:51.403 "data_size": 7936 00:26:51.403 }, 00:26:51.403 { 00:26:51.403 "name": "BaseBdev2", 00:26:51.404 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:51.404 "is_configured": true, 00:26:51.404 "data_offset": 256, 00:26:51.404 "data_size": 7936 00:26:51.404 } 00:26:51.404 ] 00:26:51.404 }' 00:26:51.404 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:51.404 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:51.404 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:51.662 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:51.662 16:03:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:51.662 16:03:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:51.922 [2024-07-12 16:03:12.243929] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:51.922 [2024-07-12 16:03:12.243965] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:51.922 [2024-07-12 16:03:12.243977] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1671240 00:26:51.922 [2024-07-12 16:03:12.243984] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:51.922 [2024-07-12 16:03:12.244266] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:51.922 [2024-07-12 16:03:12.244277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:51.922 [2024-07-12 16:03:12.244322] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:51.922 [2024-07-12 16:03:12.244328] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:51.922 [2024-07-12 16:03:12.244333] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:51.922 BaseBdev1 00:26:51.922 16:03:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.864 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.124 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:53.124 "name": "raid_bdev1", 00:26:53.124 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:53.124 "strip_size_kb": 0, 00:26:53.124 "state": "online", 00:26:53.124 "raid_level": "raid1", 00:26:53.124 "superblock": true, 00:26:53.124 "num_base_bdevs": 2, 00:26:53.124 "num_base_bdevs_discovered": 1, 00:26:53.124 "num_base_bdevs_operational": 1, 00:26:53.124 "base_bdevs_list": [ 00:26:53.124 { 00:26:53.124 "name": null, 00:26:53.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.124 "is_configured": false, 00:26:53.124 "data_offset": 256, 00:26:53.124 "data_size": 7936 00:26:53.124 }, 00:26:53.124 { 00:26:53.124 "name": "BaseBdev2", 00:26:53.124 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:53.124 "is_configured": true, 00:26:53.124 "data_offset": 256, 00:26:53.124 "data_size": 7936 00:26:53.124 } 00:26:53.124 ] 00:26:53.124 }' 00:26:53.124 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:53.124 16:03:13 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:53.694 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:53.694 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:53.694 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:53.694 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:53.694 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:53.694 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.694 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # raid_bdev_info='{ 00:26:53.954 "name": "raid_bdev1", 00:26:53.954 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:53.954 "strip_size_kb": 0, 00:26:53.954 "state": "online", 00:26:53.954 "raid_level": "raid1", 00:26:53.954 "superblock": true, 00:26:53.954 "num_base_bdevs": 2, 00:26:53.954 "num_base_bdevs_discovered": 1, 00:26:53.954 "num_base_bdevs_operational": 1, 00:26:53.954 "base_bdevs_list": [ 00:26:53.954 { 00:26:53.954 "name": null, 00:26:53.954 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:53.954 "is_configured": false, 00:26:53.954 "data_offset": 256, 00:26:53.954 "data_size": 7936 00:26:53.954 }, 00:26:53.954 { 00:26:53.954 "name": "BaseBdev2", 00:26:53.954 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:53.954 "is_configured": true, 00:26:53.954 "data_offset": 256, 00:26:53.954 "data_size": 7936 00:26:53.954 } 00:26:53.954 ] 00:26:53.954 }' 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:53.954 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:54.215 [2024-07-12 16:03:14.485619] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:54.215 [2024-07-12 16:03:14.485717] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: 
raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:54.215 [2024-07-12 16:03:14.485726] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:54.215 request: 00:26:54.215 { 00:26:54.215 "base_bdev": "BaseBdev1", 00:26:54.215 "raid_bdev": "raid_bdev1", 00:26:54.215 "method": "bdev_raid_add_base_bdev", 00:26:54.215 "req_id": 1 00:26:54.215 } 00:26:54.215 Got JSON-RPC error response 00:26:54.215 response: 00:26:54.215 { 00:26:54.215 "code": -22, 00:26:54.215 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:54.216 } 00:26:54.216 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:26:54.216 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:54.216 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:54.216 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:54.216 16:03:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.157 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:55.417 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:55.417 "name": "raid_bdev1", 00:26:55.417 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:55.417 "strip_size_kb": 0, 00:26:55.417 "state": "online", 00:26:55.417 "raid_level": "raid1", 00:26:55.417 "superblock": true, 00:26:55.417 "num_base_bdevs": 2, 00:26:55.417 "num_base_bdevs_discovered": 1, 00:26:55.417 "num_base_bdevs_operational": 1, 00:26:55.417 "base_bdevs_list": [ 00:26:55.417 { 00:26:55.417 "name": null, 00:26:55.417 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.417 "is_configured": false, 00:26:55.417 "data_offset": 256, 00:26:55.417 "data_size": 7936 00:26:55.417 }, 00:26:55.417 { 00:26:55.417 "name": "BaseBdev2", 00:26:55.417 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:55.417 "is_configured": true, 00:26:55.417 "data_offset": 256, 00:26:55.417 "data_size": 7936 
00:26:55.417 } 00:26:55.417 ] 00:26:55.417 }' 00:26:55.417 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:55.417 16:03:15 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:55.985 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:55.985 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:55.985 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:55.985 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:55.985 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:55.985 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.985 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:56.246 "name": "raid_bdev1", 00:26:56.246 "uuid": "e55d4a3e-f20c-43d9-9c48-416588df1384", 00:26:56.246 "strip_size_kb": 0, 00:26:56.246 "state": "online", 00:26:56.246 "raid_level": "raid1", 00:26:56.246 "superblock": true, 00:26:56.246 "num_base_bdevs": 2, 00:26:56.246 "num_base_bdevs_discovered": 1, 00:26:56.246 "num_base_bdevs_operational": 1, 00:26:56.246 "base_bdevs_list": [ 00:26:56.246 { 00:26:56.246 "name": null, 00:26:56.246 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:56.246 "is_configured": false, 00:26:56.246 "data_offset": 256, 00:26:56.246 "data_size": 7936 00:26:56.246 }, 00:26:56.246 { 00:26:56.246 "name": "BaseBdev2", 00:26:56.246 "uuid": "49e75df5-7e68-5a30-8efc-5b48f73b1ad7", 00:26:56.246 "is_configured": true, 00:26:56.246 "data_offset": 256, 00:26:56.246 "data_size": 7936 00:26:56.246 } 00:26:56.246 ] 00:26:56.246 }' 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 2662789 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 2662789 ']' 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 2662789 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2662789 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 2662789' 00:26:56.246 killing process with pid 2662789 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 2662789 00:26:56.246 Received shutdown signal, test time was about 60.000000 seconds 00:26:56.246 00:26:56.246 Latency(us) 00:26:56.246 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:56.246 =================================================================================================================== 00:26:56.246 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:56.246 [2024-07-12 16:03:16.621854] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:56.246 [2024-07-12 16:03:16.621921] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:56.246 [2024-07-12 16:03:16.621951] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:56.246 [2024-07-12 16:03:16.621957] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1819e10 name raid_bdev1, state offline 00:26:56.246 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 2662789 00:26:56.246 [2024-07-12 16:03:16.636799] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:56.508 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:26:56.508 00:26:56.508 real 0m27.618s 00:26:56.508 user 0m43.411s 00:26:56.508 sys 0m3.429s 00:26:56.508 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:56.508 16:03:16 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:56.508 ************************************ 00:26:56.508 END TEST raid_rebuild_test_sb_4k 00:26:56.508 ************************************ 00:26:56.508 16:03:16 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:56.508 16:03:16 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:26:56.508 16:03:16 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:26:56.508 16:03:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:26:56.508 16:03:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:56.508 16:03:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:56.508 ************************************ 00:26:56.508 START TEST raid_state_function_test_sb_md_separate 00:26:56.508 ************************************ 00:26:56.508 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:26:56.508 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:56.508 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:56.508 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:56.508 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:56.508 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:56.508 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:56.508 16:03:16 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:56.508 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=2668399 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2668399' 00:26:56.509 Process raid pid: 2668399 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 2668399 /var/tmp/spdk-raid.sock 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2668399 ']' 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:56.509 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:56.509 16:03:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:56.509 [2024-07-12 16:03:16.898221] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:26:56.509 [2024-07-12 16:03:16.898263] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:56.769 [2024-07-12 16:03:16.985426] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:56.769 [2024-07-12 16:03:17.047611] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:56.769 [2024-07-12 16:03:17.087660] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:56.770 [2024-07-12 16:03:17.087681] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:57.341 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:57.341 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:26:57.341 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:57.601 [2024-07-12 16:03:17.906665] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:57.601 [2024-07-12 16:03:17.906692] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:57.601 [2024-07-12 16:03:17.906698] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:57.601 [2024-07-12 16:03:17.906704] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.601 16:03:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:57.861 16:03:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.861 "name": "Existed_Raid", 00:26:57.861 "uuid": "4a3f545f-22e1-4458-8edb-a439f3dc2000", 00:26:57.861 "strip_size_kb": 0, 00:26:57.861 "state": "configuring", 00:26:57.861 "raid_level": "raid1", 00:26:57.861 "superblock": true, 00:26:57.861 "num_base_bdevs": 2, 00:26:57.861 "num_base_bdevs_discovered": 0, 00:26:57.861 "num_base_bdevs_operational": 2, 00:26:57.861 "base_bdevs_list": [ 00:26:57.861 { 00:26:57.861 "name": "BaseBdev1", 00:26:57.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.861 "is_configured": false, 00:26:57.861 "data_offset": 0, 00:26:57.861 "data_size": 0 00:26:57.861 }, 00:26:57.861 { 00:26:57.861 "name": "BaseBdev2", 00:26:57.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.861 "is_configured": false, 00:26:57.861 "data_offset": 0, 00:26:57.861 "data_size": 0 00:26:57.861 } 00:26:57.861 ] 00:26:57.861 }' 00:26:57.861 16:03:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.861 16:03:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:58.431 16:03:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:58.431 [2024-07-12 16:03:18.844923] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:58.431 [2024-07-12 16:03:18.844940] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a40900 name Existed_Raid, state configuring 00:26:58.431 16:03:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:58.690 [2024-07-12 16:03:19.037431] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:58.690 [2024-07-12 16:03:19.037449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:58.690 [2024-07-12 16:03:19.037454] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:58.690 [2024-07-12 16:03:19.037460] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:58.690 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:26:58.949 [2024-07-12 16:03:19.228758] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:58.949 BaseBdev1 00:26:58.949 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:58.950 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:26:58.950 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:26:58.950 
16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:26:58.950 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:26:58.950 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:26:58.950 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:59.210 [ 00:26:59.210 { 00:26:59.210 "name": "BaseBdev1", 00:26:59.210 "aliases": [ 00:26:59.210 "4c8115f1-09a2-4a57-8210-dcbf71fbe583" 00:26:59.210 ], 00:26:59.210 "product_name": "Malloc disk", 00:26:59.210 "block_size": 4096, 00:26:59.210 "num_blocks": 8192, 00:26:59.210 "uuid": "4c8115f1-09a2-4a57-8210-dcbf71fbe583", 00:26:59.210 "md_size": 32, 00:26:59.210 "md_interleave": false, 00:26:59.210 "dif_type": 0, 00:26:59.210 "assigned_rate_limits": { 00:26:59.210 "rw_ios_per_sec": 0, 00:26:59.210 "rw_mbytes_per_sec": 0, 00:26:59.210 "r_mbytes_per_sec": 0, 00:26:59.210 "w_mbytes_per_sec": 0 00:26:59.210 }, 00:26:59.210 "claimed": true, 00:26:59.210 "claim_type": "exclusive_write", 00:26:59.210 "zoned": false, 00:26:59.210 "supported_io_types": { 00:26:59.210 "read": true, 00:26:59.210 "write": true, 00:26:59.210 "unmap": true, 00:26:59.210 "flush": true, 00:26:59.210 "reset": true, 00:26:59.210 "nvme_admin": false, 00:26:59.210 "nvme_io": false, 00:26:59.210 "nvme_io_md": false, 00:26:59.210 "write_zeroes": true, 00:26:59.210 "zcopy": true, 00:26:59.210 "get_zone_info": false, 00:26:59.210 "zone_management": false, 00:26:59.210 "zone_append": false, 00:26:59.210 "compare": false, 00:26:59.210 "compare_and_write": false, 00:26:59.210 "abort": true, 00:26:59.210 "seek_hole": false, 00:26:59.210 "seek_data": false, 00:26:59.210 "copy": true, 00:26:59.210 "nvme_iov_md": false 00:26:59.210 }, 00:26:59.210 "memory_domains": [ 00:26:59.210 { 00:26:59.210 "dma_device_id": "system", 00:26:59.210 "dma_device_type": 1 00:26:59.210 }, 00:26:59.210 { 00:26:59.210 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:59.210 "dma_device_type": 2 00:26:59.210 } 00:26:59.210 ], 00:26:59.210 "driver_specific": {} 00:26:59.210 } 00:26:59.210 ] 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:59.210 16:03:19 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.210 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:59.470 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:59.470 "name": "Existed_Raid", 00:26:59.470 "uuid": "524ac885-eed9-499f-82a6-1fd95a503112", 00:26:59.470 "strip_size_kb": 0, 00:26:59.470 "state": "configuring", 00:26:59.470 "raid_level": "raid1", 00:26:59.470 "superblock": true, 00:26:59.470 "num_base_bdevs": 2, 00:26:59.470 "num_base_bdevs_discovered": 1, 00:26:59.470 "num_base_bdevs_operational": 2, 00:26:59.470 "base_bdevs_list": [ 00:26:59.470 { 00:26:59.470 "name": "BaseBdev1", 00:26:59.470 "uuid": "4c8115f1-09a2-4a57-8210-dcbf71fbe583", 00:26:59.470 "is_configured": true, 00:26:59.470 "data_offset": 256, 00:26:59.470 "data_size": 7936 00:26:59.470 }, 00:26:59.470 { 00:26:59.470 "name": "BaseBdev2", 00:26:59.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:59.470 "is_configured": false, 00:26:59.470 "data_offset": 0, 00:26:59.470 "data_size": 0 00:26:59.470 } 00:26:59.470 ] 00:26:59.470 }' 00:26:59.470 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:59.470 16:03:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:00.048 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:00.353 [2024-07-12 16:03:20.568174] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:00.353 [2024-07-12 16:03:20.568205] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a401d0 name Existed_Raid, state configuring 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:00.353 [2024-07-12 16:03:20.756679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:00.353 [2024-07-12 16:03:20.757875] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:00.353 [2024-07-12 16:03:20.757898] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.353 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:00.612 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:00.612 "name": "Existed_Raid", 00:27:00.612 "uuid": "7ae5e479-57ff-48e3-b532-70141adc0a9b", 00:27:00.612 "strip_size_kb": 0, 00:27:00.612 "state": "configuring", 00:27:00.612 "raid_level": "raid1", 00:27:00.612 "superblock": true, 00:27:00.612 "num_base_bdevs": 2, 00:27:00.612 "num_base_bdevs_discovered": 1, 00:27:00.612 "num_base_bdevs_operational": 2, 00:27:00.612 "base_bdevs_list": [ 00:27:00.612 { 00:27:00.612 "name": "BaseBdev1", 00:27:00.612 "uuid": "4c8115f1-09a2-4a57-8210-dcbf71fbe583", 00:27:00.612 "is_configured": true, 00:27:00.612 "data_offset": 256, 00:27:00.612 "data_size": 7936 00:27:00.612 }, 00:27:00.612 { 00:27:00.612 "name": "BaseBdev2", 00:27:00.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.612 "is_configured": false, 00:27:00.612 "data_offset": 0, 00:27:00.612 "data_size": 0 00:27:00.612 } 00:27:00.612 ] 00:27:00.612 }' 00:27:00.612 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:00.612 16:03:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:01.181 16:03:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:27:01.443 [2024-07-12 16:03:21.700515] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:01.443 [2024-07-12 16:03:21.700624] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a42090 00:27:01.443 [2024-07-12 16:03:21.700632] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:01.443 [2024-07-12 16:03:21.700675] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a41ad0 00:27:01.443 [2024-07-12 
16:03:21.700754] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a42090 00:27:01.443 [2024-07-12 16:03:21.700760] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a42090 00:27:01.443 [2024-07-12 16:03:21.700809] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:01.443 BaseBdev2 00:27:01.443 16:03:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:01.443 16:03:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:01.443 16:03:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:01.443 16:03:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:27:01.443 16:03:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:01.443 16:03:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:01.443 16:03:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:01.703 16:03:21 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:01.703 [ 00:27:01.703 { 00:27:01.703 "name": "BaseBdev2", 00:27:01.703 "aliases": [ 00:27:01.703 "b541294a-1131-4297-9117-bc6a033fe0ce" 00:27:01.703 ], 00:27:01.703 "product_name": "Malloc disk", 00:27:01.703 "block_size": 4096, 00:27:01.703 "num_blocks": 8192, 00:27:01.703 "uuid": "b541294a-1131-4297-9117-bc6a033fe0ce", 00:27:01.703 "md_size": 32, 00:27:01.703 "md_interleave": false, 00:27:01.703 "dif_type": 0, 00:27:01.703 "assigned_rate_limits": { 00:27:01.703 "rw_ios_per_sec": 0, 00:27:01.703 "rw_mbytes_per_sec": 0, 00:27:01.703 "r_mbytes_per_sec": 0, 00:27:01.703 "w_mbytes_per_sec": 0 00:27:01.703 }, 00:27:01.703 "claimed": true, 00:27:01.703 "claim_type": "exclusive_write", 00:27:01.703 "zoned": false, 00:27:01.703 "supported_io_types": { 00:27:01.703 "read": true, 00:27:01.703 "write": true, 00:27:01.703 "unmap": true, 00:27:01.703 "flush": true, 00:27:01.703 "reset": true, 00:27:01.703 "nvme_admin": false, 00:27:01.703 "nvme_io": false, 00:27:01.703 "nvme_io_md": false, 00:27:01.703 "write_zeroes": true, 00:27:01.703 "zcopy": true, 00:27:01.703 "get_zone_info": false, 00:27:01.703 "zone_management": false, 00:27:01.703 "zone_append": false, 00:27:01.703 "compare": false, 00:27:01.703 "compare_and_write": false, 00:27:01.703 "abort": true, 00:27:01.703 "seek_hole": false, 00:27:01.703 "seek_data": false, 00:27:01.703 "copy": true, 00:27:01.703 "nvme_iov_md": false 00:27:01.703 }, 00:27:01.703 "memory_domains": [ 00:27:01.703 { 00:27:01.703 "dma_device_id": "system", 00:27:01.703 "dma_device_type": 1 00:27:01.703 }, 00:27:01.703 { 00:27:01.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:01.703 "dma_device_type": 2 00:27:01.703 } 00:27:01.703 ], 00:27:01.703 "driver_specific": {} 00:27:01.703 } 00:27:01.703 ] 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.703 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:01.963 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:01.963 "name": "Existed_Raid", 00:27:01.963 "uuid": "7ae5e479-57ff-48e3-b532-70141adc0a9b", 00:27:01.963 "strip_size_kb": 0, 00:27:01.964 "state": "online", 00:27:01.964 "raid_level": "raid1", 00:27:01.964 "superblock": true, 00:27:01.964 "num_base_bdevs": 2, 00:27:01.964 "num_base_bdevs_discovered": 2, 00:27:01.964 "num_base_bdevs_operational": 2, 00:27:01.964 "base_bdevs_list": [ 00:27:01.964 { 00:27:01.964 "name": "BaseBdev1", 00:27:01.964 "uuid": "4c8115f1-09a2-4a57-8210-dcbf71fbe583", 00:27:01.964 "is_configured": true, 00:27:01.964 "data_offset": 256, 00:27:01.964 "data_size": 7936 00:27:01.964 }, 00:27:01.964 { 00:27:01.964 "name": "BaseBdev2", 00:27:01.964 "uuid": "b541294a-1131-4297-9117-bc6a033fe0ce", 00:27:01.964 "is_configured": true, 00:27:01.964 "data_offset": 256, 00:27:01.964 "data_size": 7936 00:27:01.964 } 00:27:01.964 ] 00:27:01.964 }' 00:27:01.964 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:01.964 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:02.533 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:02.533 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:02.533 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:02.533 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # 
local base_bdev_info 00:27:02.533 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:02.533 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:02.533 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:02.533 16:03:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:02.792 [2024-07-12 16:03:23.028108] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:02.792 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:02.792 "name": "Existed_Raid", 00:27:02.792 "aliases": [ 00:27:02.792 "7ae5e479-57ff-48e3-b532-70141adc0a9b" 00:27:02.792 ], 00:27:02.792 "product_name": "Raid Volume", 00:27:02.792 "block_size": 4096, 00:27:02.792 "num_blocks": 7936, 00:27:02.792 "uuid": "7ae5e479-57ff-48e3-b532-70141adc0a9b", 00:27:02.792 "md_size": 32, 00:27:02.792 "md_interleave": false, 00:27:02.792 "dif_type": 0, 00:27:02.792 "assigned_rate_limits": { 00:27:02.792 "rw_ios_per_sec": 0, 00:27:02.792 "rw_mbytes_per_sec": 0, 00:27:02.792 "r_mbytes_per_sec": 0, 00:27:02.792 "w_mbytes_per_sec": 0 00:27:02.792 }, 00:27:02.792 "claimed": false, 00:27:02.792 "zoned": false, 00:27:02.792 "supported_io_types": { 00:27:02.792 "read": true, 00:27:02.792 "write": true, 00:27:02.792 "unmap": false, 00:27:02.792 "flush": false, 00:27:02.792 "reset": true, 00:27:02.792 "nvme_admin": false, 00:27:02.792 "nvme_io": false, 00:27:02.792 "nvme_io_md": false, 00:27:02.792 "write_zeroes": true, 00:27:02.792 "zcopy": false, 00:27:02.792 "get_zone_info": false, 00:27:02.792 "zone_management": false, 00:27:02.792 "zone_append": false, 00:27:02.792 "compare": false, 00:27:02.792 "compare_and_write": false, 00:27:02.792 "abort": false, 00:27:02.792 "seek_hole": false, 00:27:02.792 "seek_data": false, 00:27:02.792 "copy": false, 00:27:02.792 "nvme_iov_md": false 00:27:02.792 }, 00:27:02.792 "memory_domains": [ 00:27:02.792 { 00:27:02.792 "dma_device_id": "system", 00:27:02.792 "dma_device_type": 1 00:27:02.792 }, 00:27:02.792 { 00:27:02.792 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:02.792 "dma_device_type": 2 00:27:02.792 }, 00:27:02.792 { 00:27:02.792 "dma_device_id": "system", 00:27:02.792 "dma_device_type": 1 00:27:02.792 }, 00:27:02.792 { 00:27:02.792 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:02.792 "dma_device_type": 2 00:27:02.792 } 00:27:02.792 ], 00:27:02.792 "driver_specific": { 00:27:02.792 "raid": { 00:27:02.792 "uuid": "7ae5e479-57ff-48e3-b532-70141adc0a9b", 00:27:02.792 "strip_size_kb": 0, 00:27:02.792 "state": "online", 00:27:02.792 "raid_level": "raid1", 00:27:02.792 "superblock": true, 00:27:02.792 "num_base_bdevs": 2, 00:27:02.792 "num_base_bdevs_discovered": 2, 00:27:02.792 "num_base_bdevs_operational": 2, 00:27:02.792 "base_bdevs_list": [ 00:27:02.792 { 00:27:02.792 "name": "BaseBdev1", 00:27:02.792 "uuid": "4c8115f1-09a2-4a57-8210-dcbf71fbe583", 00:27:02.792 "is_configured": true, 00:27:02.792 "data_offset": 256, 00:27:02.792 "data_size": 7936 00:27:02.792 }, 00:27:02.792 { 00:27:02.792 "name": "BaseBdev2", 00:27:02.792 "uuid": "b541294a-1131-4297-9117-bc6a033fe0ce", 00:27:02.792 "is_configured": true, 00:27:02.792 "data_offset": 256, 00:27:02.792 "data_size": 7936 00:27:02.792 } 00:27:02.792 
] 00:27:02.792 } 00:27:02.792 } 00:27:02.792 }' 00:27:02.792 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:02.792 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:02.792 BaseBdev2' 00:27:02.792 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:02.792 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:02.792 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:03.052 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:03.052 "name": "BaseBdev1", 00:27:03.052 "aliases": [ 00:27:03.052 "4c8115f1-09a2-4a57-8210-dcbf71fbe583" 00:27:03.052 ], 00:27:03.052 "product_name": "Malloc disk", 00:27:03.052 "block_size": 4096, 00:27:03.052 "num_blocks": 8192, 00:27:03.052 "uuid": "4c8115f1-09a2-4a57-8210-dcbf71fbe583", 00:27:03.052 "md_size": 32, 00:27:03.052 "md_interleave": false, 00:27:03.052 "dif_type": 0, 00:27:03.052 "assigned_rate_limits": { 00:27:03.052 "rw_ios_per_sec": 0, 00:27:03.052 "rw_mbytes_per_sec": 0, 00:27:03.052 "r_mbytes_per_sec": 0, 00:27:03.052 "w_mbytes_per_sec": 0 00:27:03.052 }, 00:27:03.052 "claimed": true, 00:27:03.052 "claim_type": "exclusive_write", 00:27:03.052 "zoned": false, 00:27:03.052 "supported_io_types": { 00:27:03.052 "read": true, 00:27:03.052 "write": true, 00:27:03.052 "unmap": true, 00:27:03.052 "flush": true, 00:27:03.052 "reset": true, 00:27:03.052 "nvme_admin": false, 00:27:03.052 "nvme_io": false, 00:27:03.052 "nvme_io_md": false, 00:27:03.052 "write_zeroes": true, 00:27:03.052 "zcopy": true, 00:27:03.052 "get_zone_info": false, 00:27:03.052 "zone_management": false, 00:27:03.052 "zone_append": false, 00:27:03.052 "compare": false, 00:27:03.052 "compare_and_write": false, 00:27:03.052 "abort": true, 00:27:03.052 "seek_hole": false, 00:27:03.052 "seek_data": false, 00:27:03.052 "copy": true, 00:27:03.052 "nvme_iov_md": false 00:27:03.052 }, 00:27:03.052 "memory_domains": [ 00:27:03.052 { 00:27:03.052 "dma_device_id": "system", 00:27:03.052 "dma_device_type": 1 00:27:03.052 }, 00:27:03.052 { 00:27:03.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:03.052 "dma_device_type": 2 00:27:03.052 } 00:27:03.052 ], 00:27:03.052 "driver_specific": {} 00:27:03.052 }' 00:27:03.052 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:03.052 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:03.052 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:03.052 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:03.052 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:03.052 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:03.052 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:03.052 16:03:23 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:03.312 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:03.312 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:03.312 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:03.312 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:03.312 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:03.312 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:03.312 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:03.572 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:03.572 "name": "BaseBdev2", 00:27:03.572 "aliases": [ 00:27:03.572 "b541294a-1131-4297-9117-bc6a033fe0ce" 00:27:03.572 ], 00:27:03.572 "product_name": "Malloc disk", 00:27:03.572 "block_size": 4096, 00:27:03.572 "num_blocks": 8192, 00:27:03.572 "uuid": "b541294a-1131-4297-9117-bc6a033fe0ce", 00:27:03.572 "md_size": 32, 00:27:03.572 "md_interleave": false, 00:27:03.572 "dif_type": 0, 00:27:03.572 "assigned_rate_limits": { 00:27:03.572 "rw_ios_per_sec": 0, 00:27:03.572 "rw_mbytes_per_sec": 0, 00:27:03.572 "r_mbytes_per_sec": 0, 00:27:03.572 "w_mbytes_per_sec": 0 00:27:03.572 }, 00:27:03.572 "claimed": true, 00:27:03.572 "claim_type": "exclusive_write", 00:27:03.572 "zoned": false, 00:27:03.572 "supported_io_types": { 00:27:03.572 "read": true, 00:27:03.572 "write": true, 00:27:03.572 "unmap": true, 00:27:03.572 "flush": true, 00:27:03.572 "reset": true, 00:27:03.572 "nvme_admin": false, 00:27:03.572 "nvme_io": false, 00:27:03.572 "nvme_io_md": false, 00:27:03.572 "write_zeroes": true, 00:27:03.572 "zcopy": true, 00:27:03.572 "get_zone_info": false, 00:27:03.572 "zone_management": false, 00:27:03.572 "zone_append": false, 00:27:03.572 "compare": false, 00:27:03.572 "compare_and_write": false, 00:27:03.572 "abort": true, 00:27:03.572 "seek_hole": false, 00:27:03.572 "seek_data": false, 00:27:03.572 "copy": true, 00:27:03.572 "nvme_iov_md": false 00:27:03.572 }, 00:27:03.572 "memory_domains": [ 00:27:03.572 { 00:27:03.572 "dma_device_id": "system", 00:27:03.572 "dma_device_type": 1 00:27:03.572 }, 00:27:03.572 { 00:27:03.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:03.572 "dma_device_type": 2 00:27:03.572 } 00:27:03.572 ], 00:27:03.572 "driver_specific": {} 00:27:03.572 }' 00:27:03.572 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:03.572 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:03.572 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:03.572 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:03.572 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:03.572 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 
]] 00:27:03.572 16:03:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:03.833 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:03.833 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:03.833 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:03.833 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:03.833 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:03.833 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:04.093 [2024-07-12 16:03:24.351282] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:04.093 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.353 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:04.353 "name": "Existed_Raid", 00:27:04.353 "uuid": "7ae5e479-57ff-48e3-b532-70141adc0a9b", 00:27:04.353 "strip_size_kb": 0, 
00:27:04.353 "state": "online", 00:27:04.353 "raid_level": "raid1", 00:27:04.353 "superblock": true, 00:27:04.353 "num_base_bdevs": 2, 00:27:04.353 "num_base_bdevs_discovered": 1, 00:27:04.353 "num_base_bdevs_operational": 1, 00:27:04.353 "base_bdevs_list": [ 00:27:04.353 { 00:27:04.353 "name": null, 00:27:04.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:04.353 "is_configured": false, 00:27:04.353 "data_offset": 256, 00:27:04.353 "data_size": 7936 00:27:04.353 }, 00:27:04.353 { 00:27:04.353 "name": "BaseBdev2", 00:27:04.353 "uuid": "b541294a-1131-4297-9117-bc6a033fe0ce", 00:27:04.353 "is_configured": true, 00:27:04.353 "data_offset": 256, 00:27:04.353 "data_size": 7936 00:27:04.353 } 00:27:04.353 ] 00:27:04.353 }' 00:27:04.353 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:04.353 16:03:24 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:04.922 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:04.922 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:04.922 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.922 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:04.922 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:04.922 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:04.922 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:05.182 [2024-07-12 16:03:25.492080] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:05.182 [2024-07-12 16:03:25.492142] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:05.182 [2024-07-12 16:03:25.498618] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:05.182 [2024-07-12 16:03:25.498644] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:05.182 [2024-07-12 16:03:25.498650] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a42090 name Existed_Raid, state offline 00:27:05.182 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:05.182 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:05.182 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.182 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:05.442 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:05.442 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:05.442 16:03:25 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:05.442 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 2668399 00:27:05.442 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2668399 ']' 00:27:05.442 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2668399 00:27:05.442 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:05.442 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:05.442 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2668399 00:27:05.442 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:05.443 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:05.443 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2668399' 00:27:05.443 killing process with pid 2668399 00:27:05.443 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2668399 00:27:05.443 [2024-07-12 16:03:25.754461] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:05.443 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 2668399 00:27:05.443 [2024-07-12 16:03:25.755052] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:05.443 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:27:05.443 00:27:05.443 real 0m9.034s 00:27:05.443 user 0m16.422s 00:27:05.443 sys 0m1.368s 00:27:05.443 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:05.443 16:03:25 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:05.443 ************************************ 00:27:05.443 END TEST raid_state_function_test_sb_md_separate 00:27:05.443 ************************************ 00:27:05.703 16:03:25 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:05.703 16:03:25 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:27:05.703 16:03:25 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:05.703 16:03:25 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:05.703 16:03:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:05.703 ************************************ 00:27:05.703 START TEST raid_superblock_test_md_separate 00:27:05.703 ************************************ 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:05.703 16:03:25 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=2670095 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 2670095 /var/tmp/spdk-raid.sock 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2670095 ']' 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:05.703 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:05.703 16:03:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:05.703 [2024-07-12 16:03:26.011351] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:27:05.703 [2024-07-12 16:03:26.011409] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2670095 ] 00:27:05.703 [2024-07-12 16:03:26.104702] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:05.962 [2024-07-12 16:03:26.180994] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:05.962 [2024-07-12 16:03:26.231726] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:05.962 [2024-07-12 16:03:26.231752] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:06.531 16:03:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:27:06.791 malloc1 00:27:06.791 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:07.050 [2024-07-12 16:03:27.247478] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:07.050 [2024-07-12 16:03:27.247511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:07.050 [2024-07-12 16:03:27.247524] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeab540 00:27:07.050 [2024-07-12 16:03:27.247530] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:07.050 [2024-07-12 16:03:27.248694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:07.050 [2024-07-12 16:03:27.248718] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:07.050 pt1 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc2 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:27:07.050 malloc2 00:27:07.050 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:07.309 [2024-07-12 16:03:27.630933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:07.309 [2024-07-12 16:03:27.630961] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:07.309 [2024-07-12 16:03:27.630970] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfbe2a0 00:27:07.309 [2024-07-12 16:03:27.630976] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:07.309 [2024-07-12 16:03:27.632009] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:07.309 [2024-07-12 16:03:27.632027] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:07.309 pt2 00:27:07.309 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:07.309 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:07.309 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:07.568 [2024-07-12 16:03:27.827441] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:07.568 [2024-07-12 16:03:27.828411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:07.568 [2024-07-12 16:03:27.828522] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfbee90 00:27:07.568 [2024-07-12 16:03:27.828531] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:07.568 [2024-07-12 16:03:27.828576] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc0ff0 00:27:07.568 [2024-07-12 16:03:27.828662] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfbee90 00:27:07.568 [2024-07-12 16:03:27.828668] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfbee90 00:27:07.568 [2024-07-12 16:03:27.828722] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:07.568 16:03:27 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.568 16:03:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:07.828 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:07.828 "name": "raid_bdev1", 00:27:07.828 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:07.828 "strip_size_kb": 0, 00:27:07.828 "state": "online", 00:27:07.828 "raid_level": "raid1", 00:27:07.828 "superblock": true, 00:27:07.828 "num_base_bdevs": 2, 00:27:07.828 "num_base_bdevs_discovered": 2, 00:27:07.828 "num_base_bdevs_operational": 2, 00:27:07.828 "base_bdevs_list": [ 00:27:07.828 { 00:27:07.828 "name": "pt1", 00:27:07.828 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:07.828 "is_configured": true, 00:27:07.828 "data_offset": 256, 00:27:07.828 "data_size": 7936 00:27:07.828 }, 00:27:07.828 { 00:27:07.828 "name": "pt2", 00:27:07.828 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:07.828 "is_configured": true, 00:27:07.828 "data_offset": 256, 00:27:07.828 "data_size": 7936 00:27:07.828 } 00:27:07.828 ] 00:27:07.828 }' 00:27:07.828 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:07.828 16:03:28 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:08.397 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:08.398 16:03:28 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:08.398 [2024-07-12 16:03:28.713875] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:08.398 "name": "raid_bdev1", 00:27:08.398 "aliases": [ 00:27:08.398 "1c2c7b90-d129-4c91-8ecc-010adb84be01" 00:27:08.398 ], 00:27:08.398 "product_name": "Raid Volume", 00:27:08.398 "block_size": 4096, 00:27:08.398 "num_blocks": 7936, 00:27:08.398 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:08.398 "md_size": 32, 00:27:08.398 "md_interleave": false, 00:27:08.398 "dif_type": 0, 00:27:08.398 "assigned_rate_limits": { 00:27:08.398 "rw_ios_per_sec": 0, 00:27:08.398 "rw_mbytes_per_sec": 0, 00:27:08.398 "r_mbytes_per_sec": 0, 00:27:08.398 "w_mbytes_per_sec": 0 00:27:08.398 }, 00:27:08.398 "claimed": false, 00:27:08.398 "zoned": false, 00:27:08.398 "supported_io_types": { 00:27:08.398 "read": true, 00:27:08.398 "write": true, 00:27:08.398 "unmap": false, 00:27:08.398 "flush": false, 00:27:08.398 "reset": true, 00:27:08.398 "nvme_admin": false, 00:27:08.398 "nvme_io": false, 00:27:08.398 "nvme_io_md": false, 00:27:08.398 "write_zeroes": true, 00:27:08.398 "zcopy": false, 00:27:08.398 "get_zone_info": false, 00:27:08.398 "zone_management": false, 00:27:08.398 "zone_append": false, 00:27:08.398 "compare": false, 00:27:08.398 "compare_and_write": false, 00:27:08.398 "abort": false, 00:27:08.398 "seek_hole": false, 00:27:08.398 "seek_data": false, 00:27:08.398 "copy": false, 00:27:08.398 "nvme_iov_md": false 00:27:08.398 }, 00:27:08.398 "memory_domains": [ 00:27:08.398 { 00:27:08.398 "dma_device_id": "system", 00:27:08.398 "dma_device_type": 1 00:27:08.398 }, 00:27:08.398 { 00:27:08.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:08.398 "dma_device_type": 2 00:27:08.398 }, 00:27:08.398 { 00:27:08.398 "dma_device_id": "system", 00:27:08.398 "dma_device_type": 1 00:27:08.398 }, 00:27:08.398 { 00:27:08.398 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:08.398 "dma_device_type": 2 00:27:08.398 } 00:27:08.398 ], 00:27:08.398 "driver_specific": { 00:27:08.398 "raid": { 00:27:08.398 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:08.398 "strip_size_kb": 0, 00:27:08.398 "state": "online", 00:27:08.398 "raid_level": "raid1", 00:27:08.398 "superblock": true, 00:27:08.398 "num_base_bdevs": 2, 00:27:08.398 "num_base_bdevs_discovered": 2, 00:27:08.398 "num_base_bdevs_operational": 2, 00:27:08.398 "base_bdevs_list": [ 00:27:08.398 { 00:27:08.398 "name": "pt1", 00:27:08.398 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:08.398 "is_configured": true, 00:27:08.398 "data_offset": 256, 00:27:08.398 "data_size": 7936 00:27:08.398 }, 00:27:08.398 { 00:27:08.398 "name": "pt2", 00:27:08.398 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:08.398 "is_configured": true, 00:27:08.398 "data_offset": 256, 00:27:08.398 "data_size": 7936 00:27:08.398 } 00:27:08.398 ] 00:27:08.398 } 00:27:08.398 } 00:27:08.398 }' 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:08.398 pt2' 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:08.398 16:03:28 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:08.398 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:08.658 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:08.658 "name": "pt1", 00:27:08.658 "aliases": [ 00:27:08.658 "00000000-0000-0000-0000-000000000001" 00:27:08.658 ], 00:27:08.658 "product_name": "passthru", 00:27:08.658 "block_size": 4096, 00:27:08.658 "num_blocks": 8192, 00:27:08.658 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:08.658 "md_size": 32, 00:27:08.658 "md_interleave": false, 00:27:08.658 "dif_type": 0, 00:27:08.658 "assigned_rate_limits": { 00:27:08.658 "rw_ios_per_sec": 0, 00:27:08.658 "rw_mbytes_per_sec": 0, 00:27:08.658 "r_mbytes_per_sec": 0, 00:27:08.658 "w_mbytes_per_sec": 0 00:27:08.658 }, 00:27:08.658 "claimed": true, 00:27:08.658 "claim_type": "exclusive_write", 00:27:08.658 "zoned": false, 00:27:08.658 "supported_io_types": { 00:27:08.658 "read": true, 00:27:08.658 "write": true, 00:27:08.658 "unmap": true, 00:27:08.658 "flush": true, 00:27:08.658 "reset": true, 00:27:08.658 "nvme_admin": false, 00:27:08.658 "nvme_io": false, 00:27:08.658 "nvme_io_md": false, 00:27:08.658 "write_zeroes": true, 00:27:08.658 "zcopy": true, 00:27:08.658 "get_zone_info": false, 00:27:08.658 "zone_management": false, 00:27:08.658 "zone_append": false, 00:27:08.658 "compare": false, 00:27:08.658 "compare_and_write": false, 00:27:08.658 "abort": true, 00:27:08.658 "seek_hole": false, 00:27:08.658 "seek_data": false, 00:27:08.658 "copy": true, 00:27:08.658 "nvme_iov_md": false 00:27:08.658 }, 00:27:08.658 "memory_domains": [ 00:27:08.658 { 00:27:08.658 "dma_device_id": "system", 00:27:08.658 "dma_device_type": 1 00:27:08.658 }, 00:27:08.658 { 00:27:08.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:08.658 "dma_device_type": 2 00:27:08.658 } 00:27:08.658 ], 00:27:08.658 "driver_specific": { 00:27:08.658 "passthru": { 00:27:08.658 "name": "pt1", 00:27:08.658 "base_bdev_name": "malloc1" 00:27:08.658 } 00:27:08.658 } 00:27:08.658 }' 00:27:08.658 16:03:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:08.658 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:08.658 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:08.658 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 
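The trace above is the base-bdev half of verify_raid_bdev_properties: for each configured base bdev (pt1 here, pt2 in the entries that follow) it fetches the bdev description with bdev_get_bdevs over the test's RPC socket and asserts block_size 4096, md_size 32, md_interleave false and dif_type 0, i.e. that the separate-metadata layout requested by bdev_malloc_create "-m 32" survives the passthru wrapper and RAID assembly. A condensed stand-alone sketch of the same check, not the test's actual helper (the check_md_layout name and the rpc shell variable are illustrative; the socket path, RPC name and jq filters are taken from the trace):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  check_md_layout() {
      # Fetch the single bdev object and compare its metadata-related fields.
      local name=$1 info
      info=$($rpc bdev_get_bdevs -b "$name" | jq '.[]')
      [[ $(jq .block_size    <<< "$info") == 4096  ]] &&
      [[ $(jq .md_size       <<< "$info") == 32    ]] &&
      [[ $(jq .md_interleave <<< "$info") == false ]] &&
      [[ $(jq .dif_type      <<< "$info") == 0     ]]
  }
  check_md_layout pt1 && check_md_layout pt2 && echo 'md-separate layout intact'
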
00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:08.918 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:09.179 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:09.179 "name": "pt2", 00:27:09.179 "aliases": [ 00:27:09.179 "00000000-0000-0000-0000-000000000002" 00:27:09.179 ], 00:27:09.179 "product_name": "passthru", 00:27:09.179 "block_size": 4096, 00:27:09.179 "num_blocks": 8192, 00:27:09.179 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:09.179 "md_size": 32, 00:27:09.179 "md_interleave": false, 00:27:09.179 "dif_type": 0, 00:27:09.179 "assigned_rate_limits": { 00:27:09.179 "rw_ios_per_sec": 0, 00:27:09.179 "rw_mbytes_per_sec": 0, 00:27:09.179 "r_mbytes_per_sec": 0, 00:27:09.179 "w_mbytes_per_sec": 0 00:27:09.179 }, 00:27:09.179 "claimed": true, 00:27:09.179 "claim_type": "exclusive_write", 00:27:09.179 "zoned": false, 00:27:09.179 "supported_io_types": { 00:27:09.179 "read": true, 00:27:09.179 "write": true, 00:27:09.179 "unmap": true, 00:27:09.179 "flush": true, 00:27:09.179 "reset": true, 00:27:09.179 "nvme_admin": false, 00:27:09.179 "nvme_io": false, 00:27:09.179 "nvme_io_md": false, 00:27:09.179 "write_zeroes": true, 00:27:09.179 "zcopy": true, 00:27:09.179 "get_zone_info": false, 00:27:09.179 "zone_management": false, 00:27:09.179 "zone_append": false, 00:27:09.179 "compare": false, 00:27:09.179 "compare_and_write": false, 00:27:09.179 "abort": true, 00:27:09.179 "seek_hole": false, 00:27:09.179 "seek_data": false, 00:27:09.179 "copy": true, 00:27:09.179 "nvme_iov_md": false 00:27:09.179 }, 00:27:09.179 "memory_domains": [ 00:27:09.179 { 00:27:09.179 "dma_device_id": "system", 00:27:09.179 "dma_device_type": 1 00:27:09.179 }, 00:27:09.179 { 00:27:09.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:09.179 "dma_device_type": 2 00:27:09.179 } 00:27:09.179 ], 00:27:09.179 "driver_specific": { 00:27:09.179 "passthru": { 00:27:09.179 "name": "pt2", 00:27:09.179 "base_bdev_name": "malloc2" 00:27:09.179 } 00:27:09.179 } 00:27:09.179 }' 00:27:09.179 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:09.179 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:09.179 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:09.179 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:09.439 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:09.439 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:09.439 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:09.439 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:09.439 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:09.439 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:09.439 16:03:29 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:09.699 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:09.699 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:09.699 16:03:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:09.699 [2024-07-12 16:03:30.069306] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:09.699 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=1c2c7b90-d129-4c91-8ecc-010adb84be01 00:27:09.699 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 1c2c7b90-d129-4c91-8ecc-010adb84be01 ']' 00:27:09.699 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:09.959 [2024-07-12 16:03:30.261578] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:09.959 [2024-07-12 16:03:30.261591] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:09.959 [2024-07-12 16:03:30.261630] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:09.959 [2024-07-12 16:03:30.261671] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:09.959 [2024-07-12 16:03:30.261677] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfbee90 name raid_bdev1, state offline 00:27:09.959 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.960 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:10.220 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:10.220 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:10.220 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:10.220 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:10.220 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:10.220 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:10.480 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:10.480 16:03:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:10.740 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:10.741 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:11.001 [2024-07-12 16:03:31.195917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:11.001 [2024-07-12 16:03:31.196982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:11.001 [2024-07-12 16:03:31.197024] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:11.001 [2024-07-12 16:03:31.197053] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:11.001 [2024-07-12 16:03:31.197063] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:11.001 [2024-07-12 16:03:31.197069] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfbfd10 name raid_bdev1, state configuring 00:27:11.001 request: 00:27:11.001 { 00:27:11.002 "name": "raid_bdev1", 00:27:11.002 "raid_level": "raid1", 00:27:11.002 "base_bdevs": [ 00:27:11.002 "malloc1", 00:27:11.002 "malloc2" 00:27:11.002 ], 00:27:11.002 "superblock": false, 00:27:11.002 "method": "bdev_raid_create", 00:27:11.002 "req_id": 1 00:27:11.002 } 00:27:11.002 Got JSON-RPC error response 00:27:11.002 response: 00:27:11.002 { 00:27:11.002 "code": -17, 00:27:11.002 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:11.002 } 00:27:11.002 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:27:11.002 16:03:31 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:11.002 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:11.002 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:11.002 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.002 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:11.002 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:11.002 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:11.002 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:11.262 [2024-07-12 16:03:31.628960] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:11.262 [2024-07-12 16:03:31.628982] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:11.262 [2024-07-12 16:03:31.628992] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc0cf0 00:27:11.262 [2024-07-12 16:03:31.628998] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:11.262 [2024-07-12 16:03:31.630122] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:11.262 [2024-07-12 16:03:31.630140] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:11.263 [2024-07-12 16:03:31.630169] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:11.263 [2024-07-12 16:03:31.630187] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:11.263 pt1 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
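This part of the trace exercises the superblock ('-s') behaviour of the md-separate RAID1: after bdev_raid_delete and bdev_passthru_delete remove raid_bdev1, pt1 and pt2, the malloc bdevs still carry the on-disk RAID superblock, so bdev_raid_create on 'malloc1 malloc2' is rejected with -17 ("File exists"), whereas re-registering pt1 lets the examine path find the superblock and re-create raid_bdev1 on its own; the query that follows reports it in "configuring" state with one of two base bdevs discovered, and it goes online once pt2 is added. A minimal way to replay that sequence by hand against the same bdev_svc socket (the rpc variable is shorthand; the RPC names and arguments come from the trace, and only the trailing .state projection in the jq filter is added for brevity):

  rpc="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
  $rpc bdev_raid_delete raid_bdev1                  # tear down the assembled array
  $rpc bdev_passthru_delete pt1
  $rpc bdev_passthru_delete pt2
  # Expected to fail with -17: both malloc bdevs still hold a raid superblock.
  $rpc bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 || true
  # Re-registering a passthru bdev triggers examine, which finds the superblock
  # and brings raid_bdev1 back in "configuring" state ...
  $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'
  # ... and "online" once the second base bdev is registered as well.
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'
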
00:27:11.263 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:11.523 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:11.523 "name": "raid_bdev1", 00:27:11.523 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:11.523 "strip_size_kb": 0, 00:27:11.523 "state": "configuring", 00:27:11.523 "raid_level": "raid1", 00:27:11.523 "superblock": true, 00:27:11.523 "num_base_bdevs": 2, 00:27:11.523 "num_base_bdevs_discovered": 1, 00:27:11.523 "num_base_bdevs_operational": 2, 00:27:11.523 "base_bdevs_list": [ 00:27:11.523 { 00:27:11.523 "name": "pt1", 00:27:11.523 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:11.523 "is_configured": true, 00:27:11.523 "data_offset": 256, 00:27:11.523 "data_size": 7936 00:27:11.523 }, 00:27:11.523 { 00:27:11.523 "name": null, 00:27:11.523 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:11.523 "is_configured": false, 00:27:11.523 "data_offset": 256, 00:27:11.523 "data_size": 7936 00:27:11.523 } 00:27:11.523 ] 00:27:11.523 }' 00:27:11.523 16:03:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:11.523 16:03:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:12.091 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:12.091 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:12.091 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:12.091 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:12.351 [2024-07-12 16:03:32.551300] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:12.351 [2024-07-12 16:03:32.551326] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:12.351 [2024-07-12 16:03:32.551336] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeab770 00:27:12.351 [2024-07-12 16:03:32.551342] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:12.351 [2024-07-12 16:03:32.551480] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:12.351 [2024-07-12 16:03:32.551489] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:12.351 [2024-07-12 16:03:32.551514] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:12.351 [2024-07-12 16:03:32.551525] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:12.351 [2024-07-12 16:03:32.551598] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc12e0 00:27:12.351 [2024-07-12 16:03:32.551604] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:12.351 [2024-07-12 16:03:32.551646] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc5a00 00:27:12.351 [2024-07-12 16:03:32.551730] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc12e0 00:27:12.351 [2024-07-12 16:03:32.551736] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc12e0 00:27:12.351 
[2024-07-12 16:03:32.551787] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:12.351 pt2 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.351 "name": "raid_bdev1", 00:27:12.351 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:12.351 "strip_size_kb": 0, 00:27:12.351 "state": "online", 00:27:12.351 "raid_level": "raid1", 00:27:12.351 "superblock": true, 00:27:12.351 "num_base_bdevs": 2, 00:27:12.351 "num_base_bdevs_discovered": 2, 00:27:12.351 "num_base_bdevs_operational": 2, 00:27:12.351 "base_bdevs_list": [ 00:27:12.351 { 00:27:12.351 "name": "pt1", 00:27:12.351 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:12.351 "is_configured": true, 00:27:12.351 "data_offset": 256, 00:27:12.351 "data_size": 7936 00:27:12.351 }, 00:27:12.351 { 00:27:12.351 "name": "pt2", 00:27:12.351 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:12.351 "is_configured": true, 00:27:12.351 "data_offset": 256, 00:27:12.351 "data_size": 7936 00:27:12.351 } 00:27:12.351 ] 00:27:12.351 }' 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.351 16:03:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:12.933 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:12.933 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:12.933 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:12.933 16:03:33 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:12.933 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:12.933 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:27:12.933 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:12.933 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:13.193 [2024-07-12 16:03:33.489902] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:13.193 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:13.193 "name": "raid_bdev1", 00:27:13.193 "aliases": [ 00:27:13.193 "1c2c7b90-d129-4c91-8ecc-010adb84be01" 00:27:13.193 ], 00:27:13.193 "product_name": "Raid Volume", 00:27:13.193 "block_size": 4096, 00:27:13.193 "num_blocks": 7936, 00:27:13.193 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:13.193 "md_size": 32, 00:27:13.193 "md_interleave": false, 00:27:13.193 "dif_type": 0, 00:27:13.193 "assigned_rate_limits": { 00:27:13.193 "rw_ios_per_sec": 0, 00:27:13.193 "rw_mbytes_per_sec": 0, 00:27:13.193 "r_mbytes_per_sec": 0, 00:27:13.193 "w_mbytes_per_sec": 0 00:27:13.193 }, 00:27:13.193 "claimed": false, 00:27:13.193 "zoned": false, 00:27:13.193 "supported_io_types": { 00:27:13.193 "read": true, 00:27:13.193 "write": true, 00:27:13.193 "unmap": false, 00:27:13.193 "flush": false, 00:27:13.193 "reset": true, 00:27:13.193 "nvme_admin": false, 00:27:13.193 "nvme_io": false, 00:27:13.193 "nvme_io_md": false, 00:27:13.193 "write_zeroes": true, 00:27:13.193 "zcopy": false, 00:27:13.193 "get_zone_info": false, 00:27:13.193 "zone_management": false, 00:27:13.193 "zone_append": false, 00:27:13.193 "compare": false, 00:27:13.193 "compare_and_write": false, 00:27:13.193 "abort": false, 00:27:13.193 "seek_hole": false, 00:27:13.193 "seek_data": false, 00:27:13.193 "copy": false, 00:27:13.193 "nvme_iov_md": false 00:27:13.193 }, 00:27:13.193 "memory_domains": [ 00:27:13.193 { 00:27:13.193 "dma_device_id": "system", 00:27:13.193 "dma_device_type": 1 00:27:13.193 }, 00:27:13.193 { 00:27:13.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.193 "dma_device_type": 2 00:27:13.193 }, 00:27:13.193 { 00:27:13.193 "dma_device_id": "system", 00:27:13.193 "dma_device_type": 1 00:27:13.193 }, 00:27:13.193 { 00:27:13.193 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.193 "dma_device_type": 2 00:27:13.193 } 00:27:13.193 ], 00:27:13.193 "driver_specific": { 00:27:13.193 "raid": { 00:27:13.193 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:13.193 "strip_size_kb": 0, 00:27:13.193 "state": "online", 00:27:13.193 "raid_level": "raid1", 00:27:13.193 "superblock": true, 00:27:13.193 "num_base_bdevs": 2, 00:27:13.193 "num_base_bdevs_discovered": 2, 00:27:13.193 "num_base_bdevs_operational": 2, 00:27:13.193 "base_bdevs_list": [ 00:27:13.193 { 00:27:13.193 "name": "pt1", 00:27:13.193 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:13.193 "is_configured": true, 00:27:13.193 "data_offset": 256, 00:27:13.193 "data_size": 7936 00:27:13.193 }, 00:27:13.193 { 00:27:13.193 "name": "pt2", 00:27:13.193 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:13.193 "is_configured": true, 00:27:13.193 "data_offset": 256, 00:27:13.193 "data_size": 7936 00:27:13.193 } 00:27:13.193 ] 00:27:13.193 } 
00:27:13.193 } 00:27:13.193 }' 00:27:13.193 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:13.193 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:13.193 pt2' 00:27:13.193 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:13.193 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:13.193 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:13.452 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:13.452 "name": "pt1", 00:27:13.452 "aliases": [ 00:27:13.452 "00000000-0000-0000-0000-000000000001" 00:27:13.452 ], 00:27:13.452 "product_name": "passthru", 00:27:13.452 "block_size": 4096, 00:27:13.452 "num_blocks": 8192, 00:27:13.452 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:13.452 "md_size": 32, 00:27:13.452 "md_interleave": false, 00:27:13.452 "dif_type": 0, 00:27:13.452 "assigned_rate_limits": { 00:27:13.452 "rw_ios_per_sec": 0, 00:27:13.452 "rw_mbytes_per_sec": 0, 00:27:13.452 "r_mbytes_per_sec": 0, 00:27:13.452 "w_mbytes_per_sec": 0 00:27:13.452 }, 00:27:13.452 "claimed": true, 00:27:13.452 "claim_type": "exclusive_write", 00:27:13.452 "zoned": false, 00:27:13.452 "supported_io_types": { 00:27:13.452 "read": true, 00:27:13.452 "write": true, 00:27:13.452 "unmap": true, 00:27:13.452 "flush": true, 00:27:13.452 "reset": true, 00:27:13.452 "nvme_admin": false, 00:27:13.452 "nvme_io": false, 00:27:13.453 "nvme_io_md": false, 00:27:13.453 "write_zeroes": true, 00:27:13.453 "zcopy": true, 00:27:13.453 "get_zone_info": false, 00:27:13.453 "zone_management": false, 00:27:13.453 "zone_append": false, 00:27:13.453 "compare": false, 00:27:13.453 "compare_and_write": false, 00:27:13.453 "abort": true, 00:27:13.453 "seek_hole": false, 00:27:13.453 "seek_data": false, 00:27:13.453 "copy": true, 00:27:13.453 "nvme_iov_md": false 00:27:13.453 }, 00:27:13.453 "memory_domains": [ 00:27:13.453 { 00:27:13.453 "dma_device_id": "system", 00:27:13.453 "dma_device_type": 1 00:27:13.453 }, 00:27:13.453 { 00:27:13.453 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.453 "dma_device_type": 2 00:27:13.453 } 00:27:13.453 ], 00:27:13.453 "driver_specific": { 00:27:13.453 "passthru": { 00:27:13.453 "name": "pt1", 00:27:13.453 "base_bdev_name": "malloc1" 00:27:13.453 } 00:27:13.453 } 00:27:13.453 }' 00:27:13.453 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:13.453 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:13.453 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:13.453 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:13.453 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:13.712 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:13.712 16:03:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:13.712 16:03:33 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:13.712 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:13.712 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:13.712 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:13.712 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:13.712 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:13.712 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:13.712 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:13.972 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:13.972 "name": "pt2", 00:27:13.972 "aliases": [ 00:27:13.972 "00000000-0000-0000-0000-000000000002" 00:27:13.972 ], 00:27:13.972 "product_name": "passthru", 00:27:13.972 "block_size": 4096, 00:27:13.972 "num_blocks": 8192, 00:27:13.972 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:13.972 "md_size": 32, 00:27:13.972 "md_interleave": false, 00:27:13.972 "dif_type": 0, 00:27:13.972 "assigned_rate_limits": { 00:27:13.972 "rw_ios_per_sec": 0, 00:27:13.972 "rw_mbytes_per_sec": 0, 00:27:13.972 "r_mbytes_per_sec": 0, 00:27:13.972 "w_mbytes_per_sec": 0 00:27:13.972 }, 00:27:13.972 "claimed": true, 00:27:13.972 "claim_type": "exclusive_write", 00:27:13.972 "zoned": false, 00:27:13.972 "supported_io_types": { 00:27:13.972 "read": true, 00:27:13.972 "write": true, 00:27:13.972 "unmap": true, 00:27:13.972 "flush": true, 00:27:13.972 "reset": true, 00:27:13.972 "nvme_admin": false, 00:27:13.972 "nvme_io": false, 00:27:13.972 "nvme_io_md": false, 00:27:13.972 "write_zeroes": true, 00:27:13.972 "zcopy": true, 00:27:13.972 "get_zone_info": false, 00:27:13.972 "zone_management": false, 00:27:13.972 "zone_append": false, 00:27:13.972 "compare": false, 00:27:13.972 "compare_and_write": false, 00:27:13.972 "abort": true, 00:27:13.972 "seek_hole": false, 00:27:13.972 "seek_data": false, 00:27:13.972 "copy": true, 00:27:13.972 "nvme_iov_md": false 00:27:13.972 }, 00:27:13.972 "memory_domains": [ 00:27:13.972 { 00:27:13.972 "dma_device_id": "system", 00:27:13.972 "dma_device_type": 1 00:27:13.972 }, 00:27:13.972 { 00:27:13.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.973 "dma_device_type": 2 00:27:13.973 } 00:27:13.973 ], 00:27:13.973 "driver_specific": { 00:27:13.973 "passthru": { 00:27:13.973 "name": "pt2", 00:27:13.973 "base_bdev_name": "malloc2" 00:27:13.973 } 00:27:13.973 } 00:27:13.973 }' 00:27:13.973 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:13.973 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:13.973 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:27:13.973 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 
== 32 ]] 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:14.232 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:14.493 [2024-07-12 16:03:34.801189] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:14.493 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 1c2c7b90-d129-4c91-8ecc-010adb84be01 '!=' 1c2c7b90-d129-4c91-8ecc-010adb84be01 ']' 00:27:14.493 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:27:14.493 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:14.493 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:27:14.493 16:03:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:15.064 [2024-07-12 16:03:35.326347] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.064 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.324 16:03:35 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:15.324 "name": "raid_bdev1", 00:27:15.324 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:15.324 "strip_size_kb": 0, 00:27:15.324 "state": "online", 00:27:15.324 "raid_level": "raid1", 00:27:15.324 "superblock": true, 00:27:15.324 "num_base_bdevs": 2, 00:27:15.324 "num_base_bdevs_discovered": 1, 00:27:15.324 "num_base_bdevs_operational": 1, 00:27:15.324 "base_bdevs_list": [ 00:27:15.324 { 00:27:15.324 "name": null, 00:27:15.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.324 "is_configured": false, 00:27:15.324 "data_offset": 256, 00:27:15.324 "data_size": 7936 00:27:15.324 }, 00:27:15.324 { 00:27:15.324 "name": "pt2", 00:27:15.324 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:15.324 "is_configured": true, 00:27:15.324 "data_offset": 256, 00:27:15.324 "data_size": 7936 00:27:15.324 } 00:27:15.324 ] 00:27:15.324 }' 00:27:15.324 16:03:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:15.324 16:03:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:15.894 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:15.894 [2024-07-12 16:03:36.268707] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:15.894 [2024-07-12 16:03:36.268725] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:15.894 [2024-07-12 16:03:36.268755] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:15.894 [2024-07-12 16:03:36.268782] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:15.894 [2024-07-12 16:03:36.268788] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc12e0 name raid_bdev1, state offline 00:27:15.894 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.894 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:16.154 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:16.154 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:16.154 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:16.154 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:16.154 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:16.415 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:16.415 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:16.415 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:16.415 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:16.415 16:03:36 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@518 -- # i=1 00:27:16.415 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:16.415 [2024-07-12 16:03:36.846150] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:16.415 [2024-07-12 16:03:36.846174] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:16.415 [2024-07-12 16:03:36.846184] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc5610 00:27:16.415 [2024-07-12 16:03:36.846190] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:16.415 [2024-07-12 16:03:36.847324] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:16.415 [2024-07-12 16:03:36.847341] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:16.415 [2024-07-12 16:03:36.847371] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:16.415 [2024-07-12 16:03:36.847388] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:16.415 [2024-07-12 16:03:36.847444] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfc1a40 00:27:16.415 [2024-07-12 16:03:36.847454] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:16.415 [2024-07-12 16:03:36.847495] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xeabb20 00:27:16.415 [2024-07-12 16:03:36.847569] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfc1a40 00:27:16.415 [2024-07-12 16:03:36.847574] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xfc1a40 00:27:16.415 [2024-07-12 16:03:36.847620] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:16.415 pt2 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.793 16:03:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:27:16.793 16:03:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:16.793 "name": "raid_bdev1", 00:27:16.793 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:16.793 "strip_size_kb": 0, 00:27:16.793 "state": "online", 00:27:16.793 "raid_level": "raid1", 00:27:16.793 "superblock": true, 00:27:16.793 "num_base_bdevs": 2, 00:27:16.793 "num_base_bdevs_discovered": 1, 00:27:16.793 "num_base_bdevs_operational": 1, 00:27:16.793 "base_bdevs_list": [ 00:27:16.793 { 00:27:16.793 "name": null, 00:27:16.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.793 "is_configured": false, 00:27:16.793 "data_offset": 256, 00:27:16.793 "data_size": 7936 00:27:16.793 }, 00:27:16.793 { 00:27:16.793 "name": "pt2", 00:27:16.793 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:16.793 "is_configured": true, 00:27:16.793 "data_offset": 256, 00:27:16.793 "data_size": 7936 00:27:16.793 } 00:27:16.793 ] 00:27:16.793 }' 00:27:16.793 16:03:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:16.793 16:03:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:17.360 16:03:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:17.360 [2024-07-12 16:03:37.736397] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:17.360 [2024-07-12 16:03:37.736412] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:17.360 [2024-07-12 16:03:37.736444] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:17.360 [2024-07-12 16:03:37.736473] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:17.360 [2024-07-12 16:03:37.736478] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc1a40 name raid_bdev1, state offline 00:27:17.360 16:03:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:17.360 16:03:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.619 16:03:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:17.619 16:03:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:17.619 16:03:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:17.619 16:03:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:17.877 [2024-07-12 16:03:38.121367] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:17.877 [2024-07-12 16:03:38.121394] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:17.877 [2024-07-12 16:03:38.121404] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfbe4d0 00:27:17.877 [2024-07-12 16:03:38.121410] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:17.877 [2024-07-12 16:03:38.122536] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:17.878 [2024-07-12 16:03:38.122553] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:17.878 [2024-07-12 16:03:38.122583] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:17.878 [2024-07-12 16:03:38.122600] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:17.878 [2024-07-12 16:03:38.122669] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:17.878 [2024-07-12 16:03:38.122676] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:17.878 [2024-07-12 16:03:38.122684] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfc4c10 name raid_bdev1, state configuring 00:27:17.878 [2024-07-12 16:03:38.122697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:17.878 [2024-07-12 16:03:38.122738] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe29b80 00:27:17.878 [2024-07-12 16:03:38.122744] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:17.878 [2024-07-12 16:03:38.122783] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfc26b0 00:27:17.878 [2024-07-12 16:03:38.122858] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe29b80 00:27:17.878 [2024-07-12 16:03:38.122863] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe29b80 00:27:17.878 [2024-07-12 16:03:38.122915] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:17.878 pt1 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.878 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.137 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:27:18.137 "name": "raid_bdev1", 00:27:18.137 "uuid": "1c2c7b90-d129-4c91-8ecc-010adb84be01", 00:27:18.137 "strip_size_kb": 0, 00:27:18.137 "state": "online", 00:27:18.137 "raid_level": "raid1", 00:27:18.137 "superblock": true, 00:27:18.137 "num_base_bdevs": 2, 00:27:18.137 "num_base_bdevs_discovered": 1, 00:27:18.137 "num_base_bdevs_operational": 1, 00:27:18.137 "base_bdevs_list": [ 00:27:18.137 { 00:27:18.137 "name": null, 00:27:18.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.137 "is_configured": false, 00:27:18.137 "data_offset": 256, 00:27:18.137 "data_size": 7936 00:27:18.137 }, 00:27:18.137 { 00:27:18.137 "name": "pt2", 00:27:18.137 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:18.137 "is_configured": true, 00:27:18.137 "data_offset": 256, 00:27:18.137 "data_size": 7936 00:27:18.137 } 00:27:18.137 ] 00:27:18.137 }' 00:27:18.137 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:18.137 16:03:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:18.705 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:18.705 16:03:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:18.705 16:03:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:18.705 16:03:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:18.705 16:03:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:18.965 [2024-07-12 16:03:39.216327] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 1c2c7b90-d129-4c91-8ecc-010adb84be01 '!=' 1c2c7b90-d129-4c91-8ecc-010adb84be01 ']' 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 2670095 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2670095 ']' 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 2670095 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2670095 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2670095' 00:27:18.965 killing process with pid 2670095 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 2670095 00:27:18.965 [2024-07-12 16:03:39.285917] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:18.965 [2024-07-12 16:03:39.285954] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:18.965 [2024-07-12 16:03:39.285983] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:18.965 [2024-07-12 16:03:39.285989] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe29b80 name raid_bdev1, state offline 00:27:18.965 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 2670095 00:27:18.965 [2024-07-12 16:03:39.298656] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:19.226 16:03:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:27:19.226 00:27:19.226 real 0m13.463s 00:27:19.226 user 0m24.939s 00:27:19.226 sys 0m2.043s 00:27:19.226 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:19.226 16:03:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:19.226 ************************************ 00:27:19.226 END TEST raid_superblock_test_md_separate 00:27:19.226 ************************************ 00:27:19.226 16:03:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:19.226 16:03:39 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:27:19.226 16:03:39 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:27:19.226 16:03:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:19.226 16:03:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:19.226 16:03:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:19.226 ************************************ 00:27:19.226 START TEST raid_rebuild_test_sb_md_separate 00:27:19.226 ************************************ 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:19.226 16:03:39 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:19.226 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=2672623 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 2672623 /var/tmp/spdk-raid.sock 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 2672623 ']' 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:19.227 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:19.227 16:03:39 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:19.227 [2024-07-12 16:03:39.568208] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:27:19.227 [2024-07-12 16:03:39.568261] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2672623 ] 00:27:19.227 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:19.227 Zero copy mechanism will not be used. 
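At this point the rebuild test has launched bdevperf with -z, so the process sits idle on the RPC socket while the script builds the raid1 bdev underneath it; the malloc/passthru base bdevs that follow are all created over that same socket. A condensed sketch of that sequence, assuming it is run from the SPDK repository root (relative paths stand in for the absolute Jenkins workspace paths shown in the log):

  sock=/var/tmp/spdk-raid.sock
  # Start bdevperf idle (-z) against the raid bdev under test: 60 s of randrw, 3 MiB I/Os, queue depth 2.
  build/examples/bdevperf -r "$sock" -T raid_bdev1 -t 60 -w randrw -M 50 \
      -o 3M -q 2 -U -z -L bdev_raid &
  # Create a 32 MiB malloc bdev with 4 KiB blocks and 32-byte per-block metadata
  # (kept separate, matching the md_separate variant), then wrap it in a passthru bdev.
  scripts/rpc.py -s "$sock" bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc
  scripts/rpc.py -s "$sock" bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1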
00:27:19.227 [2024-07-12 16:03:39.654701] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:19.488 [2024-07-12 16:03:39.719070] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:19.488 [2024-07-12 16:03:39.758776] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:19.488 [2024-07-12 16:03:39.758798] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:20.057 16:03:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:20.057 16:03:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:27:20.057 16:03:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:20.057 16:03:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:27:20.316 BaseBdev1_malloc 00:27:20.316 16:03:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:20.316 [2024-07-12 16:03:40.757265] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:20.316 [2024-07-12 16:03:40.757300] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:20.316 [2024-07-12 16:03:40.757314] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2849a00 00:27:20.316 [2024-07-12 16:03:40.757321] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:20.316 [2024-07-12 16:03:40.758532] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:20.316 [2024-07-12 16:03:40.758551] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:20.316 BaseBdev1 00:27:20.576 16:03:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:20.576 16:03:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:27:20.576 BaseBdev2_malloc 00:27:20.576 16:03:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:20.835 [2024-07-12 16:03:41.144781] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:20.835 [2024-07-12 16:03:41.144811] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:20.835 [2024-07-12 16:03:41.144824] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x295c760 00:27:20.835 [2024-07-12 16:03:41.144830] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:20.835 [2024-07-12 16:03:41.145918] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:20.835 [2024-07-12 16:03:41.145936] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:20.835 BaseBdev2 00:27:20.835 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:27:21.094 spare_malloc 00:27:21.094 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:21.353 spare_delay 00:27:21.353 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:21.353 [2024-07-12 16:03:41.720486] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:21.353 [2024-07-12 16:03:41.720512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:21.353 [2024-07-12 16:03:41.720526] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x295f120 00:27:21.353 [2024-07-12 16:03:41.720532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:21.353 [2024-07-12 16:03:41.721589] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:21.353 [2024-07-12 16:03:41.721606] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:21.353 spare 00:27:21.353 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:21.612 [2024-07-12 16:03:41.908984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:21.612 [2024-07-12 16:03:41.909969] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:21.612 [2024-07-12 16:03:41.910085] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x29608d0 00:27:21.612 [2024-07-12 16:03:41.910092] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:21.612 [2024-07-12 16:03:41.910139] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27c95e0 00:27:21.612 [2024-07-12 16:03:41.910227] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x29608d0 00:27:21.612 [2024-07-12 16:03:41.910232] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x29608d0 00:27:21.612 [2024-07-12 16:03:41.910280] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.612 16:03:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.879 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.879 "name": "raid_bdev1", 00:27:21.879 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:21.879 "strip_size_kb": 0, 00:27:21.879 "state": "online", 00:27:21.879 "raid_level": "raid1", 00:27:21.879 "superblock": true, 00:27:21.879 "num_base_bdevs": 2, 00:27:21.879 "num_base_bdevs_discovered": 2, 00:27:21.879 "num_base_bdevs_operational": 2, 00:27:21.879 "base_bdevs_list": [ 00:27:21.879 { 00:27:21.879 "name": "BaseBdev1", 00:27:21.879 "uuid": "b15caace-2135-5cef-8c12-239aae7e0c33", 00:27:21.879 "is_configured": true, 00:27:21.879 "data_offset": 256, 00:27:21.879 "data_size": 7936 00:27:21.879 }, 00:27:21.879 { 00:27:21.879 "name": "BaseBdev2", 00:27:21.879 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:21.879 "is_configured": true, 00:27:21.879 "data_offset": 256, 00:27:21.879 "data_size": 7936 00:27:21.879 } 00:27:21.879 ] 00:27:21.879 }' 00:27:21.879 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.879 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:22.447 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:22.447 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:22.447 [2024-07-12 16:03:42.791396] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:22.448 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:22.448 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.448 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:22.706 16:03:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:22.706 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:22.706 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:22.706 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:22.965 [2024-07-12 16:03:43.176195] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27c95e0 00:27:22.965 /dev/nbd0 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:22.965 1+0 records in 00:27:22.965 1+0 records out 00:27:22.965 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270965 s, 15.1 MB/s 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:22.965 16:03:43 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:27:22.965 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:23.533 7936+0 records in 00:27:23.533 7936+0 records out 00:27:23.533 32505856 bytes (33 MB, 31 MiB) copied, 0.60991 s, 53.3 MB/s 00:27:23.533 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:23.533 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:23.533 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:23.533 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:23.533 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:23.533 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:23.533 16:03:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:23.791 [2024-07-12 16:03:44.037295] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:23.791 [2024-07-12 16:03:44.215243] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.791 16:03:44 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.791 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.051 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:24.051 "name": "raid_bdev1", 00:27:24.051 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:24.051 "strip_size_kb": 0, 00:27:24.051 "state": "online", 00:27:24.051 "raid_level": "raid1", 00:27:24.051 "superblock": true, 00:27:24.051 "num_base_bdevs": 2, 00:27:24.051 "num_base_bdevs_discovered": 1, 00:27:24.051 "num_base_bdevs_operational": 1, 00:27:24.051 "base_bdevs_list": [ 00:27:24.051 { 00:27:24.051 "name": null, 00:27:24.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.051 "is_configured": false, 00:27:24.051 "data_offset": 256, 00:27:24.051 "data_size": 7936 00:27:24.051 }, 00:27:24.051 { 00:27:24.051 "name": "BaseBdev2", 00:27:24.051 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:24.051 "is_configured": true, 00:27:24.051 "data_offset": 256, 00:27:24.051 "data_size": 7936 00:27:24.051 } 00:27:24.051 ] 00:27:24.051 }' 00:27:24.051 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:24.051 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:24.619 16:03:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:24.878 [2024-07-12 16:03:45.149610] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:24.878 [2024-07-12 16:03:45.151262] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2960780 00:27:24.878 [2024-07-12 16:03:45.152787] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:24.878 16:03:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:25.817 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:25.817 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:25.817 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:25.817 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:25.817 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:25.817 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # 
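The verify_raid_bdev_state helper traced above is essentially a jq filter over the RPC output: the degraded array still reports "state": "online" but only one of its two base bdevs is discovered and operational. Re-attaching the spare then starts a rebuild, as the "Started rebuild" notice shows. A sketch of those two steps, same assumed socket:

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")'
#   degraded raid1: "state": "online", "num_base_bdevs_discovered": 1,
#                   "num_base_bdevs_operational": 1
$rpc_py bdev_raid_add_base_bdev raid_bdev1 spare   # kicks off the rebuild onto "spare"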
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.817 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.076 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:26.076 "name": "raid_bdev1", 00:27:26.076 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:26.076 "strip_size_kb": 0, 00:27:26.076 "state": "online", 00:27:26.076 "raid_level": "raid1", 00:27:26.076 "superblock": true, 00:27:26.076 "num_base_bdevs": 2, 00:27:26.076 "num_base_bdevs_discovered": 2, 00:27:26.076 "num_base_bdevs_operational": 2, 00:27:26.076 "process": { 00:27:26.076 "type": "rebuild", 00:27:26.076 "target": "spare", 00:27:26.076 "progress": { 00:27:26.076 "blocks": 2816, 00:27:26.076 "percent": 35 00:27:26.076 } 00:27:26.076 }, 00:27:26.076 "base_bdevs_list": [ 00:27:26.076 { 00:27:26.076 "name": "spare", 00:27:26.076 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:26.076 "is_configured": true, 00:27:26.076 "data_offset": 256, 00:27:26.076 "data_size": 7936 00:27:26.076 }, 00:27:26.076 { 00:27:26.076 "name": "BaseBdev2", 00:27:26.076 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:26.076 "is_configured": true, 00:27:26.076 "data_offset": 256, 00:27:26.076 "data_size": 7936 00:27:26.076 } 00:27:26.076 ] 00:27:26.076 }' 00:27:26.076 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:26.076 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:26.076 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:26.076 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:26.076 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:26.336 [2024-07-12 16:03:46.609892] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:26.336 [2024-07-12 16:03:46.661749] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:26.336 [2024-07-12 16:03:46.661780] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:26.336 [2024-07-12 16:03:46.661790] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:26.336 [2024-07-12 16:03:46.661794] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 
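While a rebuild is running, bdev_raid_get_bdevs adds a process object to the raid bdev, and verify_raid_bdev_process above only checks its type and target with jq. Removing the rebuild target in the middle of the rebuild (the bdev_raid_remove_base_bdev spare call traced above) aborts it, which is what the "No such device" warning and error reflect, and the array drops back to a single operational member. A sketch of the progress query; info is a local shorthand, not a name from the script:

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
info=$($rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
jq -r '.process.type // "none"'    <<< "$info"   # "rebuild" while one is running
jq -r '.process.target // "none"'  <<< "$info"   # "spare", the bdev being rebuilt
jq -r '.process.progress.percent'  <<< "$info"   # e.g. 35 in the trace above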
00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.336 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.595 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.595 "name": "raid_bdev1", 00:27:26.595 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:26.595 "strip_size_kb": 0, 00:27:26.595 "state": "online", 00:27:26.595 "raid_level": "raid1", 00:27:26.595 "superblock": true, 00:27:26.595 "num_base_bdevs": 2, 00:27:26.595 "num_base_bdevs_discovered": 1, 00:27:26.595 "num_base_bdevs_operational": 1, 00:27:26.595 "base_bdevs_list": [ 00:27:26.595 { 00:27:26.595 "name": null, 00:27:26.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.595 "is_configured": false, 00:27:26.595 "data_offset": 256, 00:27:26.595 "data_size": 7936 00:27:26.595 }, 00:27:26.595 { 00:27:26.595 "name": "BaseBdev2", 00:27:26.595 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:26.595 "is_configured": true, 00:27:26.595 "data_offset": 256, 00:27:26.595 "data_size": 7936 00:27:26.595 } 00:27:26.595 ] 00:27:26.595 }' 00:27:26.595 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.595 16:03:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:27.165 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:27.165 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:27.165 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:27.165 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:27.165 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:27.165 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:27.165 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:27.165 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:27.165 "name": "raid_bdev1", 00:27:27.165 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:27.165 "strip_size_kb": 0, 00:27:27.165 "state": "online", 00:27:27.165 "raid_level": "raid1", 00:27:27.165 "superblock": true, 00:27:27.165 "num_base_bdevs": 2, 00:27:27.165 "num_base_bdevs_discovered": 1, 00:27:27.165 "num_base_bdevs_operational": 1, 00:27:27.165 "base_bdevs_list": [ 00:27:27.165 { 00:27:27.165 "name": null, 00:27:27.165 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:27:27.165 "is_configured": false, 00:27:27.165 "data_offset": 256, 00:27:27.165 "data_size": 7936 00:27:27.165 }, 00:27:27.165 { 00:27:27.165 "name": "BaseBdev2", 00:27:27.165 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:27.165 "is_configured": true, 00:27:27.165 "data_offset": 256, 00:27:27.165 "data_size": 7936 00:27:27.165 } 00:27:27.165 ] 00:27:27.165 }' 00:27:27.165 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:27.425 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:27.425 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:27.425 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:27.425 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:27.425 [2024-07-12 16:03:47.858785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:27.425 [2024-07-12 16:03:47.860402] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2964060 00:27:27.425 [2024-07-12 16:03:47.861524] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:27.685 16:03:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:28.623 16:03:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:28.623 16:03:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:28.623 16:03:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:28.623 16:03:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:28.623 16:03:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:28.623 16:03:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.623 16:03:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:28.884 "name": "raid_bdev1", 00:27:28.884 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:28.884 "strip_size_kb": 0, 00:27:28.884 "state": "online", 00:27:28.884 "raid_level": "raid1", 00:27:28.884 "superblock": true, 00:27:28.884 "num_base_bdevs": 2, 00:27:28.884 "num_base_bdevs_discovered": 2, 00:27:28.884 "num_base_bdevs_operational": 2, 00:27:28.884 "process": { 00:27:28.884 "type": "rebuild", 00:27:28.884 "target": "spare", 00:27:28.884 "progress": { 00:27:28.884 "blocks": 2816, 00:27:28.884 "percent": 35 00:27:28.884 } 00:27:28.884 }, 00:27:28.884 "base_bdevs_list": [ 00:27:28.884 { 00:27:28.884 "name": "spare", 00:27:28.884 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:28.884 "is_configured": true, 00:27:28.884 "data_offset": 256, 00:27:28.884 "data_size": 7936 00:27:28.884 }, 00:27:28.884 { 00:27:28.884 "name": 
"BaseBdev2", 00:27:28.884 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:28.884 "is_configured": true, 00:27:28.884 "data_offset": 256, 00:27:28.884 "data_size": 7936 00:27:28.884 } 00:27:28.884 ] 00:27:28.884 }' 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:28.884 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=975 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.884 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.144 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:29.144 "name": "raid_bdev1", 00:27:29.144 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:29.144 "strip_size_kb": 0, 00:27:29.144 "state": "online", 00:27:29.144 "raid_level": "raid1", 00:27:29.144 "superblock": true, 00:27:29.144 "num_base_bdevs": 2, 00:27:29.144 "num_base_bdevs_discovered": 2, 00:27:29.144 "num_base_bdevs_operational": 2, 00:27:29.144 "process": { 00:27:29.144 "type": "rebuild", 00:27:29.144 "target": "spare", 00:27:29.144 "progress": { 00:27:29.144 "blocks": 3584, 00:27:29.144 "percent": 45 00:27:29.144 } 00:27:29.144 }, 00:27:29.144 "base_bdevs_list": [ 00:27:29.144 { 00:27:29.144 "name": "spare", 00:27:29.144 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:29.144 "is_configured": true, 00:27:29.144 "data_offset": 256, 00:27:29.144 "data_size": 7936 
00:27:29.144 }, 00:27:29.144 { 00:27:29.144 "name": "BaseBdev2", 00:27:29.144 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:29.144 "is_configured": true, 00:27:29.144 "data_offset": 256, 00:27:29.144 "data_size": 7936 00:27:29.144 } 00:27:29.144 ] 00:27:29.144 }' 00:27:29.144 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:29.144 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:29.144 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:29.144 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:29.144 16:03:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:30.086 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:30.086 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:30.086 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:30.086 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:30.086 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:30.086 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:30.086 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.086 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:30.346 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:30.346 "name": "raid_bdev1", 00:27:30.346 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:30.346 "strip_size_kb": 0, 00:27:30.346 "state": "online", 00:27:30.346 "raid_level": "raid1", 00:27:30.346 "superblock": true, 00:27:30.346 "num_base_bdevs": 2, 00:27:30.346 "num_base_bdevs_discovered": 2, 00:27:30.346 "num_base_bdevs_operational": 2, 00:27:30.347 "process": { 00:27:30.347 "type": "rebuild", 00:27:30.347 "target": "spare", 00:27:30.347 "progress": { 00:27:30.347 "blocks": 6912, 00:27:30.347 "percent": 87 00:27:30.347 } 00:27:30.347 }, 00:27:30.347 "base_bdevs_list": [ 00:27:30.347 { 00:27:30.347 "name": "spare", 00:27:30.347 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:30.347 "is_configured": true, 00:27:30.347 "data_offset": 256, 00:27:30.347 "data_size": 7936 00:27:30.347 }, 00:27:30.347 { 00:27:30.347 "name": "BaseBdev2", 00:27:30.347 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:30.347 "is_configured": true, 00:27:30.347 "data_offset": 256, 00:27:30.347 "data_size": 7936 00:27:30.347 } 00:27:30.347 ] 00:27:30.347 }' 00:27:30.347 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:30.347 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:30.347 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:30.347 
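One genuine script defect shows up in the trace above: bdev_raid.sh line 665 evaluates '[' = false ']' and bash prints "[: =: unary operator expected", which means the left-hand operand of the test expanded to an empty string. The run carries on, since the failing [ just returns non-zero, but the comparison errors out instead of evaluating. The snippet below reproduces and quotes away the problem with a made-up variable name, purely as an illustration and not the actual variable used in bdev_raid.sh:

some_flag=""                                       # empty expansion reproduces the message
[ $some_flag = false ] && echo "was false"         # -> [: =: unary operator expected
[ "${some_flag:-}" = false ] && echo "was false"   # quoted form just compares an empty string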
16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:30.347 16:03:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:30.607 [2024-07-12 16:03:50.979832] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:30.607 [2024-07-12 16:03:50.979880] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:30.607 [2024-07-12 16:03:50.979943] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:31.546 "name": "raid_bdev1", 00:27:31.546 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:31.546 "strip_size_kb": 0, 00:27:31.546 "state": "online", 00:27:31.546 "raid_level": "raid1", 00:27:31.546 "superblock": true, 00:27:31.546 "num_base_bdevs": 2, 00:27:31.546 "num_base_bdevs_discovered": 2, 00:27:31.546 "num_base_bdevs_operational": 2, 00:27:31.546 "base_bdevs_list": [ 00:27:31.546 { 00:27:31.546 "name": "spare", 00:27:31.546 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:31.546 "is_configured": true, 00:27:31.546 "data_offset": 256, 00:27:31.546 "data_size": 7936 00:27:31.546 }, 00:27:31.546 { 00:27:31.546 "name": "BaseBdev2", 00:27:31.546 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:31.546 "is_configured": true, 00:27:31.546 "data_offset": 256, 00:27:31.546 "data_size": 7936 00:27:31.546 } 00:27:31.546 ] 00:27:31.546 }' 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:31.546 16:03:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:31.806 16:03:52 
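The completion wait above is a bounded polling loop: bdev_raid.sh keeps re-reading the raid bdev until the process object disappears, with a 975-second ceiling enforced through bash's SECONDS counter. A compact sketch of the same loop, under the usual socket assumption; ptype is a local shorthand:

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
timeout=975                                # seconds, as in the trace above
while (( SECONDS < timeout )); do
    ptype=$($rpc_py bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1") | .process.type // "none"')
    [[ $ptype == rebuild ]] || break       # the process object vanishes once the rebuild finishes
    sleep 1
done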
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:31.806 "name": "raid_bdev1", 00:27:31.806 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:31.806 "strip_size_kb": 0, 00:27:31.806 "state": "online", 00:27:31.806 "raid_level": "raid1", 00:27:31.806 "superblock": true, 00:27:31.806 "num_base_bdevs": 2, 00:27:31.806 "num_base_bdevs_discovered": 2, 00:27:31.806 "num_base_bdevs_operational": 2, 00:27:31.806 "base_bdevs_list": [ 00:27:31.806 { 00:27:31.806 "name": "spare", 00:27:31.806 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:31.806 "is_configured": true, 00:27:31.806 "data_offset": 256, 00:27:31.806 "data_size": 7936 00:27:31.806 }, 00:27:31.806 { 00:27:31.806 "name": "BaseBdev2", 00:27:31.806 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:31.806 "is_configured": true, 00:27:31.806 "data_offset": 256, 00:27:31.806 "data_size": 7936 00:27:31.806 } 00:27:31.806 ] 00:27:31.806 }' 00:27:31.806 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:27:32.066 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.326 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.326 "name": "raid_bdev1", 00:27:32.326 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:32.326 "strip_size_kb": 0, 00:27:32.326 "state": "online", 00:27:32.326 "raid_level": "raid1", 00:27:32.326 "superblock": true, 00:27:32.326 "num_base_bdevs": 2, 00:27:32.326 "num_base_bdevs_discovered": 2, 00:27:32.326 "num_base_bdevs_operational": 2, 00:27:32.326 "base_bdevs_list": [ 00:27:32.326 { 00:27:32.326 "name": "spare", 00:27:32.326 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:32.326 "is_configured": true, 00:27:32.326 "data_offset": 256, 00:27:32.326 "data_size": 7936 00:27:32.326 }, 00:27:32.326 { 00:27:32.326 "name": "BaseBdev2", 00:27:32.326 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:32.326 "is_configured": true, 00:27:32.326 "data_offset": 256, 00:27:32.326 "data_size": 7936 00:27:32.326 } 00:27:32.326 ] 00:27:32.326 }' 00:27:32.326 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.326 16:03:52 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:32.902 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:32.902 [2024-07-12 16:03:53.221684] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:32.902 [2024-07-12 16:03:53.221702] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:32.902 [2024-07-12 16:03:53.221749] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:32.902 [2024-07-12 16:03:53.221790] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:32.902 [2024-07-12 16:03:53.221796] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x29608d0 name raid_bdev1, state offline 00:27:32.902 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:27:32.902 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # 
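After the healthy two-member state is confirmed, the raid bdev itself is deleted and the test asserts that no raid bdevs remain before moving on to the data comparison over NBD (continued below). A sketch of that deletion check:

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py bdev_raid_delete raid_bdev1
count=$($rpc_py bdev_raid_get_bdevs all | jq length)
[[ $count == 0 ]]                          # no raid bdevs left after the delete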
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:33.161 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:33.162 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:33.162 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:33.162 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:33.422 /dev/nbd0 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:33.422 1+0 records in 00:27:33.422 1+0 records out 00:27:33.422 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000292115 s, 14.0 MB/s 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:33.422 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:33.682 /dev/nbd1 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:33.682 1+0 records in 00:27:33.682 1+0 records out 00:27:33.682 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000320079 s, 12.8 MB/s 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:33.682 16:03:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- 
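The cmp step above is the actual data-integrity check of the rebuild: both base bdevs are exported over NBD and compared byte for byte, skipping the first 1048576 bytes on each side (256 blocks of 4096 bytes, matching the reported data_offset, so the per-bdev superblock area is left out of the comparison). A sketch of that comparison, same assumed socket:

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py nbd_start_disk BaseBdev1 /dev/nbd0
$rpc_py nbd_start_disk spare /dev/nbd1
cmp -i 1048576 /dev/nbd0 /dev/nbd1         # skip 256 * 4096 B on both sides, then compare
$rpc_py nbd_stop_disk /dev/nbd0
$rpc_py nbd_stop_disk /dev/nbd1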
# waitfornbd_exit nbd0 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:33.964 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:34.266 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:34.528 [2024-07-12 16:03:54.747293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:34.528 [2024-07-12 16:03:54.747327] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:34.528 [2024-07-12 16:03:54.747341] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2964000 00:27:34.528 [2024-07-12 16:03:54.747347] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:34.528 [2024-07-12 16:03:54.748525] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:34.528 [2024-07-12 16:03:54.748546] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:34.528 [2024-07-12 16:03:54.748593] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:34.528 [2024-07-12 16:03:54.748613] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:34.528 [2024-07-12 16:03:54.748688] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev2 is claimed 00:27:34.528 spare 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.528 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:34.528 [2024-07-12 16:03:54.848983] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2961260 00:27:34.528 [2024-07-12 16:03:54.848992] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:34.528 [2024-07-12 16:03:54.849042] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x295fd80 00:27:34.528 [2024-07-12 16:03:54.849135] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2961260 00:27:34.528 [2024-07-12 16:03:54.849141] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2961260 00:27:34.528 [2024-07-12 16:03:54.849194] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:34.788 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:34.788 "name": "raid_bdev1", 00:27:34.788 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:34.788 "strip_size_kb": 0, 00:27:34.788 "state": "online", 00:27:34.788 "raid_level": "raid1", 00:27:34.788 "superblock": true, 00:27:34.788 "num_base_bdevs": 2, 00:27:34.788 "num_base_bdevs_discovered": 2, 00:27:34.788 "num_base_bdevs_operational": 2, 00:27:34.788 "base_bdevs_list": [ 00:27:34.788 { 00:27:34.788 "name": "spare", 00:27:34.788 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:34.788 "is_configured": true, 00:27:34.788 "data_offset": 256, 00:27:34.788 "data_size": 7936 00:27:34.788 }, 00:27:34.788 { 00:27:34.788 "name": "BaseBdev2", 00:27:34.788 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:34.788 "is_configured": true, 00:27:34.788 "data_offset": 256, 00:27:34.788 "data_size": 7936 00:27:34.788 } 00:27:34.788 ] 00:27:34.788 }' 00:27:34.788 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:34.788 16:03:54 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 
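The passthru traces above show how the array comes back after being deleted: the test recreates the "spare" passthru bdev on top of spare_delay, the examine path finds the raid superblock on it, claims spare and BaseBdev2, and re-assembles raid_bdev1 without any explicit create call. A sketch of the trigger, same assumed socket:

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py bdev_passthru_delete spare
# recreating the passthru re-runs bdev examine; the on-disk raid superblock
# is found and raid_bdev1 is brought back online from its two members
$rpc_py bdev_passthru_create -b spare_delay -p spare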
-- # set +x 00:27:35.048 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:35.048 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:35.048 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:35.048 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:35.048 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:35.049 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:35.049 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:35.309 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:35.309 "name": "raid_bdev1", 00:27:35.309 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:35.309 "strip_size_kb": 0, 00:27:35.309 "state": "online", 00:27:35.309 "raid_level": "raid1", 00:27:35.309 "superblock": true, 00:27:35.309 "num_base_bdevs": 2, 00:27:35.309 "num_base_bdevs_discovered": 2, 00:27:35.309 "num_base_bdevs_operational": 2, 00:27:35.309 "base_bdevs_list": [ 00:27:35.309 { 00:27:35.309 "name": "spare", 00:27:35.309 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:35.309 "is_configured": true, 00:27:35.309 "data_offset": 256, 00:27:35.309 "data_size": 7936 00:27:35.309 }, 00:27:35.309 { 00:27:35.309 "name": "BaseBdev2", 00:27:35.309 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:35.309 "is_configured": true, 00:27:35.309 "data_offset": 256, 00:27:35.309 "data_size": 7936 00:27:35.309 } 00:27:35.309 ] 00:27:35.309 }' 00:27:35.309 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:35.309 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:35.309 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:35.569 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:35.569 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:35.569 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:35.569 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:35.569 16:03:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:35.829 [2024-07-12 16:03:56.147010] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:35.829 16:03:56 
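The jq check above confirms that "spare" is back as the first entry in base_bdevs_list after the superblock-driven re-assembly, and the subsequent remove detaches it again so that its on-disk superblock falls behind the array (the seq_number 4 versus 5 trace that follows). A sketch of those two calls:

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py bdev_raid_get_bdevs all | jq -r '.[].base_bdevs_list[0].name'   # -> spare
$rpc_py bdev_raid_remove_base_bdev spare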
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:35.829 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.089 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:36.089 "name": "raid_bdev1", 00:27:36.089 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:36.089 "strip_size_kb": 0, 00:27:36.089 "state": "online", 00:27:36.089 "raid_level": "raid1", 00:27:36.089 "superblock": true, 00:27:36.090 "num_base_bdevs": 2, 00:27:36.090 "num_base_bdevs_discovered": 1, 00:27:36.090 "num_base_bdevs_operational": 1, 00:27:36.090 "base_bdevs_list": [ 00:27:36.090 { 00:27:36.090 "name": null, 00:27:36.090 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:36.090 "is_configured": false, 00:27:36.090 "data_offset": 256, 00:27:36.090 "data_size": 7936 00:27:36.090 }, 00:27:36.090 { 00:27:36.090 "name": "BaseBdev2", 00:27:36.090 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:36.090 "is_configured": true, 00:27:36.090 "data_offset": 256, 00:27:36.090 "data_size": 7936 00:27:36.090 } 00:27:36.090 ] 00:27:36.090 }' 00:27:36.090 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:36.090 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:36.659 16:03:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:36.659 [2024-07-12 16:03:57.069357] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:36.659 [2024-07-12 16:03:57.069464] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:36.659 [2024-07-12 16:03:57.069473] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:36.659 [2024-07-12 16:03:57.069491] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:36.659 [2024-07-12 16:03:57.071058] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x29627a0 00:27:36.659 [2024-07-12 16:03:57.072178] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:36.659 16:03:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:38.039 "name": "raid_bdev1", 00:27:38.039 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:38.039 "strip_size_kb": 0, 00:27:38.039 "state": "online", 00:27:38.039 "raid_level": "raid1", 00:27:38.039 "superblock": true, 00:27:38.039 "num_base_bdevs": 2, 00:27:38.039 "num_base_bdevs_discovered": 2, 00:27:38.039 "num_base_bdevs_operational": 2, 00:27:38.039 "process": { 00:27:38.039 "type": "rebuild", 00:27:38.039 "target": "spare", 00:27:38.039 "progress": { 00:27:38.039 "blocks": 2816, 00:27:38.039 "percent": 35 00:27:38.039 } 00:27:38.039 }, 00:27:38.039 "base_bdevs_list": [ 00:27:38.039 { 00:27:38.039 "name": "spare", 00:27:38.039 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:38.039 "is_configured": true, 00:27:38.039 "data_offset": 256, 00:27:38.039 "data_size": 7936 00:27:38.039 }, 00:27:38.039 { 00:27:38.039 "name": "BaseBdev2", 00:27:38.039 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:38.039 "is_configured": true, 00:27:38.039 "data_offset": 256, 00:27:38.039 "data_size": 7936 00:27:38.039 } 00:27:38.039 ] 00:27:38.039 }' 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:38.039 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:38.299 [2024-07-12 16:03:58.549663] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:38.299 [2024-07-12 16:03:58.581132] bdev_raid.c:2513:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:38.299 [2024-07-12 16:03:58.581161] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:38.299 [2024-07-12 16:03:58.581175] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:38.299 [2024-07-12 16:03:58.581179] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.299 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:38.560 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:38.560 "name": "raid_bdev1", 00:27:38.560 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:38.560 "strip_size_kb": 0, 00:27:38.560 "state": "online", 00:27:38.560 "raid_level": "raid1", 00:27:38.560 "superblock": true, 00:27:38.560 "num_base_bdevs": 2, 00:27:38.560 "num_base_bdevs_discovered": 1, 00:27:38.560 "num_base_bdevs_operational": 1, 00:27:38.560 "base_bdevs_list": [ 00:27:38.560 { 00:27:38.560 "name": null, 00:27:38.560 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:38.560 "is_configured": false, 00:27:38.560 "data_offset": 256, 00:27:38.560 "data_size": 7936 00:27:38.560 }, 00:27:38.560 { 00:27:38.560 "name": "BaseBdev2", 00:27:38.560 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:38.560 "is_configured": true, 00:27:38.560 "data_offset": 256, 00:27:38.560 "data_size": 7936 00:27:38.560 } 00:27:38.560 ] 00:27:38.560 }' 00:27:38.560 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:38.560 16:03:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:39.133 16:03:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:39.133 [2024-07-12 16:03:59.537470] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 
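The warnings above are the second abort scenario: this time the bdev underneath the rebuild target is hot-removed by deleting the passthru wrapper itself rather than calling bdev_raid_remove_base_bdev, the rebuild again finishes with "No such device", and the array drops to one operational member before the wrapper is recreated. A sketch of that sequence, with the add call included for context:

rpc_py="/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock"
$rpc_py bdev_raid_add_base_bdev raid_bdev1 spare       # stale member, rebuild starts
$rpc_py bdev_passthru_delete spare                     # hot-remove the bdev under rebuild; rebuild is aborted
$rpc_py bdev_passthru_create -b spare_delay -p spare   # bring the wrapper back for the next round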
00:27:39.133 [2024-07-12 16:03:59.537502] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:39.133 [2024-07-12 16:03:59.537517] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28d20c0 00:27:39.133 [2024-07-12 16:03:59.537524] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:39.133 [2024-07-12 16:03:59.537694] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:39.133 [2024-07-12 16:03:59.537704] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:39.133 [2024-07-12 16:03:59.537749] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:39.133 [2024-07-12 16:03:59.537756] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:39.133 [2024-07-12 16:03:59.537761] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:39.133 [2024-07-12 16:03:59.537772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:39.133 [2024-07-12 16:03:59.539324] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27c7e30 00:27:39.133 [2024-07-12 16:03:59.540406] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:39.133 spare 00:27:39.133 16:03:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:40.514 "name": "raid_bdev1", 00:27:40.514 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:40.514 "strip_size_kb": 0, 00:27:40.514 "state": "online", 00:27:40.514 "raid_level": "raid1", 00:27:40.514 "superblock": true, 00:27:40.514 "num_base_bdevs": 2, 00:27:40.514 "num_base_bdevs_discovered": 2, 00:27:40.514 "num_base_bdevs_operational": 2, 00:27:40.514 "process": { 00:27:40.514 "type": "rebuild", 00:27:40.514 "target": "spare", 00:27:40.514 "progress": { 00:27:40.514 "blocks": 2816, 00:27:40.514 "percent": 35 00:27:40.514 } 00:27:40.514 }, 00:27:40.514 "base_bdevs_list": [ 00:27:40.514 { 00:27:40.514 "name": "spare", 00:27:40.514 "uuid": "0c674afa-d144-5666-a2fb-3da43785e603", 00:27:40.514 "is_configured": true, 00:27:40.514 "data_offset": 256, 00:27:40.514 "data_size": 7936 00:27:40.514 }, 00:27:40.514 { 00:27:40.514 "name": "BaseBdev2", 00:27:40.514 "uuid": 
"c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:40.514 "is_configured": true, 00:27:40.514 "data_offset": 256, 00:27:40.514 "data_size": 7936 00:27:40.514 } 00:27:40.514 ] 00:27:40.514 }' 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:40.514 16:04:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:40.774 [2024-07-12 16:04:01.022271] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:40.774 [2024-07-12 16:04:01.049318] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:40.774 [2024-07-12 16:04:01.049347] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:40.774 [2024-07-12 16:04:01.049356] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:40.774 [2024-07-12 16:04:01.049361] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.774 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.034 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:41.034 "name": "raid_bdev1", 00:27:41.034 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:41.034 "strip_size_kb": 0, 00:27:41.034 "state": "online", 00:27:41.034 "raid_level": "raid1", 00:27:41.034 "superblock": true, 00:27:41.034 "num_base_bdevs": 2, 00:27:41.034 "num_base_bdevs_discovered": 1, 00:27:41.034 
"num_base_bdevs_operational": 1, 00:27:41.034 "base_bdevs_list": [ 00:27:41.034 { 00:27:41.034 "name": null, 00:27:41.034 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:41.034 "is_configured": false, 00:27:41.034 "data_offset": 256, 00:27:41.034 "data_size": 7936 00:27:41.034 }, 00:27:41.034 { 00:27:41.034 "name": "BaseBdev2", 00:27:41.034 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:41.034 "is_configured": true, 00:27:41.034 "data_offset": 256, 00:27:41.034 "data_size": 7936 00:27:41.034 } 00:27:41.034 ] 00:27:41.034 }' 00:27:41.034 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:41.034 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:41.602 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:41.602 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:41.602 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:41.602 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:41.602 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:41.602 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.602 16:04:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.602 16:04:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:41.602 "name": "raid_bdev1", 00:27:41.602 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:41.602 "strip_size_kb": 0, 00:27:41.602 "state": "online", 00:27:41.602 "raid_level": "raid1", 00:27:41.602 "superblock": true, 00:27:41.602 "num_base_bdevs": 2, 00:27:41.602 "num_base_bdevs_discovered": 1, 00:27:41.602 "num_base_bdevs_operational": 1, 00:27:41.602 "base_bdevs_list": [ 00:27:41.602 { 00:27:41.602 "name": null, 00:27:41.602 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:41.602 "is_configured": false, 00:27:41.602 "data_offset": 256, 00:27:41.602 "data_size": 7936 00:27:41.602 }, 00:27:41.602 { 00:27:41.602 "name": "BaseBdev2", 00:27:41.602 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:41.602 "is_configured": true, 00:27:41.602 "data_offset": 256, 00:27:41.602 "data_size": 7936 00:27:41.602 } 00:27:41.602 ] 00:27:41.602 }' 00:27:41.602 16:04:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:41.862 16:04:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:41.862 16:04:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:41.862 16:04:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:41.862 16:04:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:41.862 16:04:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:42.121 [2024-07-12 16:04:02.450806] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:42.121 [2024-07-12 16:04:02.450834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:42.121 [2024-07-12 16:04:02.450850] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2849c30 00:27:42.121 [2024-07-12 16:04:02.450856] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:42.121 [2024-07-12 16:04:02.450998] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:42.121 [2024-07-12 16:04:02.451007] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:42.121 [2024-07-12 16:04:02.451036] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:42.121 [2024-07-12 16:04:02.451043] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:42.121 [2024-07-12 16:04:02.451049] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:42.121 BaseBdev1 00:27:42.121 16:04:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.059 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.318 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:43.318 "name": "raid_bdev1", 00:27:43.318 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:43.318 "strip_size_kb": 0, 00:27:43.318 "state": "online", 00:27:43.318 "raid_level": "raid1", 00:27:43.318 "superblock": true, 00:27:43.318 "num_base_bdevs": 2, 00:27:43.318 "num_base_bdevs_discovered": 1, 00:27:43.318 "num_base_bdevs_operational": 1, 00:27:43.318 "base_bdevs_list": [ 00:27:43.318 { 
00:27:43.318 "name": null, 00:27:43.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.319 "is_configured": false, 00:27:43.319 "data_offset": 256, 00:27:43.319 "data_size": 7936 00:27:43.319 }, 00:27:43.319 { 00:27:43.319 "name": "BaseBdev2", 00:27:43.319 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:43.319 "is_configured": true, 00:27:43.319 "data_offset": 256, 00:27:43.319 "data_size": 7936 00:27:43.319 } 00:27:43.319 ] 00:27:43.319 }' 00:27:43.319 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:43.319 16:04:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:43.887 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:43.887 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:43.887 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:43.887 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:43.887 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:43.887 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.887 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:44.147 "name": "raid_bdev1", 00:27:44.147 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:44.147 "strip_size_kb": 0, 00:27:44.147 "state": "online", 00:27:44.147 "raid_level": "raid1", 00:27:44.147 "superblock": true, 00:27:44.147 "num_base_bdevs": 2, 00:27:44.147 "num_base_bdevs_discovered": 1, 00:27:44.147 "num_base_bdevs_operational": 1, 00:27:44.147 "base_bdevs_list": [ 00:27:44.147 { 00:27:44.147 "name": null, 00:27:44.147 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:44.147 "is_configured": false, 00:27:44.147 "data_offset": 256, 00:27:44.147 "data_size": 7936 00:27:44.147 }, 00:27:44.147 { 00:27:44.147 "name": "BaseBdev2", 00:27:44.147 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:44.147 "is_configured": true, 00:27:44.147 "data_offset": 256, 00:27:44.147 "data_size": 7936 00:27:44.147 } 00:27:44.147 ] 00:27:44.147 }' 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:44.147 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:44.407 [2024-07-12 16:04:04.688517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:44.407 [2024-07-12 16:04:04.688602] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:44.407 [2024-07-12 16:04:04.688610] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:44.407 request: 00:27:44.407 { 00:27:44.407 "base_bdev": "BaseBdev1", 00:27:44.407 "raid_bdev": "raid_bdev1", 00:27:44.407 "method": "bdev_raid_add_base_bdev", 00:27:44.407 "req_id": 1 00:27:44.407 } 00:27:44.407 Got JSON-RPC error response 00:27:44.407 response: 00:27:44.407 { 00:27:44.407 "code": -22, 00:27:44.407 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:44.407 } 00:27:44.407 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:27:44.407 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:44.407 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:44.407 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:44.407 16:04:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:45.359 16:04:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.359 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.621 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.621 "name": "raid_bdev1", 00:27:45.621 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:45.621 "strip_size_kb": 0, 00:27:45.621 "state": "online", 00:27:45.621 "raid_level": "raid1", 00:27:45.621 "superblock": true, 00:27:45.621 "num_base_bdevs": 2, 00:27:45.621 "num_base_bdevs_discovered": 1, 00:27:45.621 "num_base_bdevs_operational": 1, 00:27:45.621 "base_bdevs_list": [ 00:27:45.621 { 00:27:45.621 "name": null, 00:27:45.621 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:45.621 "is_configured": false, 00:27:45.621 "data_offset": 256, 00:27:45.621 "data_size": 7936 00:27:45.621 }, 00:27:45.621 { 00:27:45.621 "name": "BaseBdev2", 00:27:45.621 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:45.621 "is_configured": true, 00:27:45.621 "data_offset": 256, 00:27:45.621 "data_size": 7936 00:27:45.621 } 00:27:45.621 ] 00:27:45.621 }' 00:27:45.621 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.621 16:04:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:46.190 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:46.190 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.190 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:46.190 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:46.190 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.190 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.190 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:46.449 "name": "raid_bdev1", 00:27:46.449 "uuid": "736d6204-73f5-48ee-a2be-80c0b5686a40", 00:27:46.449 "strip_size_kb": 0, 
00:27:46.449 "state": "online", 00:27:46.449 "raid_level": "raid1", 00:27:46.449 "superblock": true, 00:27:46.449 "num_base_bdevs": 2, 00:27:46.449 "num_base_bdevs_discovered": 1, 00:27:46.449 "num_base_bdevs_operational": 1, 00:27:46.449 "base_bdevs_list": [ 00:27:46.449 { 00:27:46.449 "name": null, 00:27:46.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:46.449 "is_configured": false, 00:27:46.449 "data_offset": 256, 00:27:46.449 "data_size": 7936 00:27:46.449 }, 00:27:46.449 { 00:27:46.449 "name": "BaseBdev2", 00:27:46.449 "uuid": "c1a2d682-b110-5d63-beb8-dc6e0fb5850d", 00:27:46.449 "is_configured": true, 00:27:46.449 "data_offset": 256, 00:27:46.449 "data_size": 7936 00:27:46.449 } 00:27:46.449 ] 00:27:46.449 }' 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 2672623 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 2672623 ']' 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 2672623 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2672623 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2672623' 00:27:46.449 killing process with pid 2672623 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 2672623 00:27:46.449 Received shutdown signal, test time was about 60.000000 seconds 00:27:46.449 00:27:46.449 Latency(us) 00:27:46.449 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:46.449 =================================================================================================================== 00:27:46.449 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:46.449 [2024-07-12 16:04:06.797735] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:46.449 [2024-07-12 16:04:06.797801] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:46.449 [2024-07-12 16:04:06.797836] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:46.449 [2024-07-12 16:04:06.797842] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2961260 name raid_bdev1, state offline 00:27:46.449 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 
2672623 00:27:46.449 [2024-07-12 16:04:06.816429] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:46.710 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:27:46.710 00:27:46.710 real 0m27.444s 00:27:46.710 user 0m43.051s 00:27:46.710 sys 0m3.363s 00:27:46.710 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:46.710 16:04:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:46.710 ************************************ 00:27:46.710 END TEST raid_rebuild_test_sb_md_separate 00:27:46.710 ************************************ 00:27:46.710 16:04:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:46.710 16:04:06 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:27:46.710 16:04:06 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:27:46.710 16:04:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:27:46.710 16:04:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:46.710 16:04:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:46.710 ************************************ 00:27:46.710 START TEST raid_state_function_test_sb_md_interleaved 00:27:46.710 ************************************ 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=2677568 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2677568' 00:27:46.710 Process raid pid: 2677568 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 2677568 /var/tmp/spdk-raid.sock 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2677568 ']' 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:46.710 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:46.710 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:46.710 [2024-07-12 16:04:07.092884] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:27:46.710 [2024-07-12 16:04:07.092943] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:46.970 [2024-07-12 16:04:07.184343] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:46.970 [2024-07-12 16:04:07.251618] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:46.970 [2024-07-12 16:04:07.303133] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:46.970 [2024-07-12 16:04:07.303157] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:47.538 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:47.538 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:27:47.538 16:04:07 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:47.797 [2024-07-12 16:04:08.094701] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:47.797 [2024-07-12 16:04:08.094734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:47.797 [2024-07-12 16:04:08.094740] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:47.797 [2024-07-12 16:04:08.094746] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.797 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:48.057 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:48.057 "name": "Existed_Raid", 00:27:48.057 "uuid": "751f6513-466a-418a-a1c2-8bad3bc97622", 00:27:48.057 "strip_size_kb": 0, 00:27:48.057 "state": "configuring", 00:27:48.057 "raid_level": "raid1", 00:27:48.057 "superblock": true, 00:27:48.057 "num_base_bdevs": 2, 00:27:48.057 "num_base_bdevs_discovered": 0, 00:27:48.057 "num_base_bdevs_operational": 2, 00:27:48.057 "base_bdevs_list": [ 00:27:48.057 { 00:27:48.057 "name": "BaseBdev1", 00:27:48.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.057 "is_configured": false, 00:27:48.057 "data_offset": 0, 00:27:48.057 "data_size": 0 00:27:48.057 }, 00:27:48.057 { 00:27:48.057 "name": "BaseBdev2", 00:27:48.057 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:48.057 "is_configured": false, 00:27:48.057 "data_offset": 0, 00:27:48.057 "data_size": 0 00:27:48.057 } 00:27:48.057 ] 00:27:48.057 }' 00:27:48.057 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:48.057 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:48.626 16:04:08 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:48.626 [2024-07-12 16:04:09.020942] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:48.626 [2024-07-12 16:04:09.020959] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x158c900 name Existed_Raid, state configuring 00:27:48.626 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:48.885 [2024-07-12 16:04:09.209434] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:48.885 [2024-07-12 16:04:09.209453] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:48.885 [2024-07-12 16:04:09.209458] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:48.885 [2024-07-12 16:04:09.209464] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:48.885 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:27:49.144 [2024-07-12 16:04:09.372544] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:49.144 BaseBdev1 00:27:49.144 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:49.144 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:27:49.144 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:49.144 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:27:49.144 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:49.145 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:49.145 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:49.145 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:49.404 [ 00:27:49.404 { 00:27:49.404 "name": "BaseBdev1", 00:27:49.404 "aliases": [ 00:27:49.404 "44c3df54-d6d5-43ef-872b-fd798707960b" 00:27:49.404 ], 00:27:49.404 "product_name": "Malloc disk", 00:27:49.404 "block_size": 4128, 00:27:49.404 "num_blocks": 8192, 00:27:49.404 "uuid": "44c3df54-d6d5-43ef-872b-fd798707960b", 00:27:49.404 "md_size": 32, 00:27:49.404 "md_interleave": true, 00:27:49.404 "dif_type": 0, 00:27:49.404 "assigned_rate_limits": { 00:27:49.404 "rw_ios_per_sec": 0, 00:27:49.404 "rw_mbytes_per_sec": 0, 00:27:49.404 "r_mbytes_per_sec": 0, 00:27:49.404 "w_mbytes_per_sec": 0 00:27:49.404 }, 00:27:49.404 "claimed": true, 00:27:49.404 "claim_type": "exclusive_write", 00:27:49.404 "zoned": false, 00:27:49.404 "supported_io_types": { 00:27:49.404 "read": true, 00:27:49.404 "write": true, 00:27:49.404 "unmap": true, 00:27:49.404 "flush": true, 00:27:49.404 "reset": true, 00:27:49.404 "nvme_admin": false, 00:27:49.404 "nvme_io": false, 00:27:49.404 "nvme_io_md": false, 00:27:49.404 "write_zeroes": true, 00:27:49.404 "zcopy": true, 00:27:49.404 "get_zone_info": false, 00:27:49.404 "zone_management": false, 00:27:49.404 "zone_append": false, 00:27:49.404 "compare": false, 00:27:49.404 "compare_and_write": false, 00:27:49.404 "abort": true, 00:27:49.404 "seek_hole": false, 00:27:49.404 "seek_data": false, 00:27:49.404 "copy": true, 00:27:49.404 "nvme_iov_md": false 00:27:49.404 }, 00:27:49.404 "memory_domains": [ 00:27:49.404 { 00:27:49.404 "dma_device_id": "system", 00:27:49.404 "dma_device_type": 1 00:27:49.404 }, 00:27:49.404 { 00:27:49.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:49.404 "dma_device_type": 2 00:27:49.404 } 00:27:49.404 ], 00:27:49.404 "driver_specific": {} 00:27:49.404 } 00:27:49.404 ] 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.404 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:49.664 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:49.664 "name": "Existed_Raid", 00:27:49.664 "uuid": "6aa64329-c8ad-4b18-b1d9-e2cd6e555b6d", 00:27:49.664 "strip_size_kb": 0, 00:27:49.664 "state": "configuring", 00:27:49.664 "raid_level": "raid1", 00:27:49.664 "superblock": true, 00:27:49.664 "num_base_bdevs": 2, 00:27:49.664 "num_base_bdevs_discovered": 1, 00:27:49.664 "num_base_bdevs_operational": 2, 00:27:49.664 "base_bdevs_list": [ 00:27:49.664 { 00:27:49.664 "name": "BaseBdev1", 00:27:49.664 "uuid": "44c3df54-d6d5-43ef-872b-fd798707960b", 00:27:49.664 "is_configured": true, 00:27:49.664 "data_offset": 256, 00:27:49.664 "data_size": 7936 00:27:49.664 }, 00:27:49.664 { 00:27:49.664 "name": "BaseBdev2", 00:27:49.664 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:49.664 "is_configured": false, 00:27:49.664 "data_offset": 0, 00:27:49.664 "data_size": 0 00:27:49.664 } 00:27:49.664 ] 00:27:49.664 }' 00:27:49.664 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:49.664 16:04:09 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:50.233 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:50.233 [2024-07-12 16:04:10.675861] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:50.233 [2024-07-12 16:04:10.675889] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x158c1d0 name Existed_Raid, state configuring 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:50.493 [2024-07-12 16:04:10.852331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:50.493 [2024-07-12 16:04:10.853456] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:50.493 [2024-07-12 16:04:10.853479] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.493 16:04:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:50.753 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.753 "name": "Existed_Raid", 00:27:50.753 "uuid": "dc1b475a-d8cb-447c-91b2-732a0c95bf48", 00:27:50.753 "strip_size_kb": 0, 00:27:50.753 "state": "configuring", 00:27:50.753 "raid_level": "raid1", 00:27:50.753 "superblock": true, 00:27:50.753 "num_base_bdevs": 2, 00:27:50.753 "num_base_bdevs_discovered": 1, 00:27:50.753 "num_base_bdevs_operational": 2, 00:27:50.753 "base_bdevs_list": [ 00:27:50.753 { 00:27:50.753 "name": "BaseBdev1", 00:27:50.753 "uuid": "44c3df54-d6d5-43ef-872b-fd798707960b", 00:27:50.753 "is_configured": true, 00:27:50.753 "data_offset": 256, 00:27:50.753 "data_size": 7936 00:27:50.753 }, 00:27:50.753 { 00:27:50.753 "name": "BaseBdev2", 00:27:50.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.753 "is_configured": false, 00:27:50.753 "data_offset": 0, 00:27:50.753 "data_size": 0 00:27:50.753 } 00:27:50.753 ] 00:27:50.753 }' 00:27:50.753 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.753 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:51.322 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:27:51.589 [2024-07-12 16:04:11.783737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:51.589 [2024-07-12 16:04:11.783834] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x158e000 00:27:51.589 [2024-07-12 16:04:11.783841] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:51.589 [2024-07-12 16:04:11.783882] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x158dfd0 00:27:51.589 [2024-07-12 16:04:11.783939] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x158e000 00:27:51.590 [2024-07-12 16:04:11.783944] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name 
Existed_Raid, raid_bdev 0x158e000 00:27:51.590 [2024-07-12 16:04:11.783985] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:51.590 BaseBdev2 00:27:51.590 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:51.590 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:27:51.590 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:27:51.590 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:27:51.590 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:27:51.590 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:27:51.590 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:51.590 16:04:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:51.852 [ 00:27:51.852 { 00:27:51.852 "name": "BaseBdev2", 00:27:51.852 "aliases": [ 00:27:51.852 "05379333-92ad-4a45-9ad6-8dd6758aaa02" 00:27:51.852 ], 00:27:51.852 "product_name": "Malloc disk", 00:27:51.852 "block_size": 4128, 00:27:51.852 "num_blocks": 8192, 00:27:51.852 "uuid": "05379333-92ad-4a45-9ad6-8dd6758aaa02", 00:27:51.852 "md_size": 32, 00:27:51.852 "md_interleave": true, 00:27:51.852 "dif_type": 0, 00:27:51.852 "assigned_rate_limits": { 00:27:51.852 "rw_ios_per_sec": 0, 00:27:51.852 "rw_mbytes_per_sec": 0, 00:27:51.852 "r_mbytes_per_sec": 0, 00:27:51.852 "w_mbytes_per_sec": 0 00:27:51.852 }, 00:27:51.852 "claimed": true, 00:27:51.852 "claim_type": "exclusive_write", 00:27:51.852 "zoned": false, 00:27:51.852 "supported_io_types": { 00:27:51.852 "read": true, 00:27:51.852 "write": true, 00:27:51.852 "unmap": true, 00:27:51.852 "flush": true, 00:27:51.852 "reset": true, 00:27:51.852 "nvme_admin": false, 00:27:51.852 "nvme_io": false, 00:27:51.852 "nvme_io_md": false, 00:27:51.852 "write_zeroes": true, 00:27:51.852 "zcopy": true, 00:27:51.852 "get_zone_info": false, 00:27:51.852 "zone_management": false, 00:27:51.852 "zone_append": false, 00:27:51.852 "compare": false, 00:27:51.852 "compare_and_write": false, 00:27:51.852 "abort": true, 00:27:51.852 "seek_hole": false, 00:27:51.852 "seek_data": false, 00:27:51.852 "copy": true, 00:27:51.852 "nvme_iov_md": false 00:27:51.852 }, 00:27:51.852 "memory_domains": [ 00:27:51.852 { 00:27:51.852 "dma_device_id": "system", 00:27:51.852 "dma_device_type": 1 00:27:51.852 }, 00:27:51.852 { 00:27:51.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:51.852 "dma_device_type": 2 00:27:51.852 } 00:27:51.852 ], 00:27:51.852 "driver_specific": {} 00:27:51.852 } 00:27:51.852 ] 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:51.852 16:04:12 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:51.852 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.112 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.112 "name": "Existed_Raid", 00:27:52.112 "uuid": "dc1b475a-d8cb-447c-91b2-732a0c95bf48", 00:27:52.112 "strip_size_kb": 0, 00:27:52.112 "state": "online", 00:27:52.112 "raid_level": "raid1", 00:27:52.112 "superblock": true, 00:27:52.112 "num_base_bdevs": 2, 00:27:52.112 "num_base_bdevs_discovered": 2, 00:27:52.112 "num_base_bdevs_operational": 2, 00:27:52.112 "base_bdevs_list": [ 00:27:52.112 { 00:27:52.112 "name": "BaseBdev1", 00:27:52.112 "uuid": "44c3df54-d6d5-43ef-872b-fd798707960b", 00:27:52.112 "is_configured": true, 00:27:52.112 "data_offset": 256, 00:27:52.112 "data_size": 7936 00:27:52.112 }, 00:27:52.112 { 00:27:52.112 "name": "BaseBdev2", 00:27:52.112 "uuid": "05379333-92ad-4a45-9ad6-8dd6758aaa02", 00:27:52.112 "is_configured": true, 00:27:52.112 "data_offset": 256, 00:27:52.112 "data_size": 7936 00:27:52.112 } 00:27:52.112 ] 00:27:52.112 }' 00:27:52.112 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.112 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:52.680 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:52.680 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:52.680 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:52.680 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:52.680 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # 
local base_bdev_names 00:27:52.680 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:27:52.680 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:52.680 16:04:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:52.939 [2024-07-12 16:04:13.135384] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:52.939 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:52.939 "name": "Existed_Raid", 00:27:52.939 "aliases": [ 00:27:52.939 "dc1b475a-d8cb-447c-91b2-732a0c95bf48" 00:27:52.939 ], 00:27:52.939 "product_name": "Raid Volume", 00:27:52.939 "block_size": 4128, 00:27:52.939 "num_blocks": 7936, 00:27:52.939 "uuid": "dc1b475a-d8cb-447c-91b2-732a0c95bf48", 00:27:52.939 "md_size": 32, 00:27:52.939 "md_interleave": true, 00:27:52.939 "dif_type": 0, 00:27:52.939 "assigned_rate_limits": { 00:27:52.939 "rw_ios_per_sec": 0, 00:27:52.939 "rw_mbytes_per_sec": 0, 00:27:52.939 "r_mbytes_per_sec": 0, 00:27:52.939 "w_mbytes_per_sec": 0 00:27:52.939 }, 00:27:52.939 "claimed": false, 00:27:52.939 "zoned": false, 00:27:52.939 "supported_io_types": { 00:27:52.939 "read": true, 00:27:52.939 "write": true, 00:27:52.939 "unmap": false, 00:27:52.939 "flush": false, 00:27:52.939 "reset": true, 00:27:52.939 "nvme_admin": false, 00:27:52.939 "nvme_io": false, 00:27:52.939 "nvme_io_md": false, 00:27:52.939 "write_zeroes": true, 00:27:52.939 "zcopy": false, 00:27:52.939 "get_zone_info": false, 00:27:52.939 "zone_management": false, 00:27:52.939 "zone_append": false, 00:27:52.939 "compare": false, 00:27:52.939 "compare_and_write": false, 00:27:52.939 "abort": false, 00:27:52.939 "seek_hole": false, 00:27:52.939 "seek_data": false, 00:27:52.939 "copy": false, 00:27:52.939 "nvme_iov_md": false 00:27:52.939 }, 00:27:52.939 "memory_domains": [ 00:27:52.939 { 00:27:52.939 "dma_device_id": "system", 00:27:52.939 "dma_device_type": 1 00:27:52.939 }, 00:27:52.939 { 00:27:52.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:52.939 "dma_device_type": 2 00:27:52.939 }, 00:27:52.939 { 00:27:52.939 "dma_device_id": "system", 00:27:52.939 "dma_device_type": 1 00:27:52.939 }, 00:27:52.939 { 00:27:52.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:52.939 "dma_device_type": 2 00:27:52.939 } 00:27:52.939 ], 00:27:52.939 "driver_specific": { 00:27:52.939 "raid": { 00:27:52.939 "uuid": "dc1b475a-d8cb-447c-91b2-732a0c95bf48", 00:27:52.939 "strip_size_kb": 0, 00:27:52.939 "state": "online", 00:27:52.939 "raid_level": "raid1", 00:27:52.939 "superblock": true, 00:27:52.939 "num_base_bdevs": 2, 00:27:52.939 "num_base_bdevs_discovered": 2, 00:27:52.939 "num_base_bdevs_operational": 2, 00:27:52.939 "base_bdevs_list": [ 00:27:52.939 { 00:27:52.939 "name": "BaseBdev1", 00:27:52.939 "uuid": "44c3df54-d6d5-43ef-872b-fd798707960b", 00:27:52.939 "is_configured": true, 00:27:52.939 "data_offset": 256, 00:27:52.939 "data_size": 7936 00:27:52.939 }, 00:27:52.939 { 00:27:52.939 "name": "BaseBdev2", 00:27:52.939 "uuid": "05379333-92ad-4a45-9ad6-8dd6758aaa02", 00:27:52.939 "is_configured": true, 00:27:52.939 "data_offset": 256, 00:27:52.939 "data_size": 7936 00:27:52.939 } 00:27:52.939 ] 00:27:52.939 } 00:27:52.939 } 00:27:52.939 }' 00:27:52.939 16:04:13 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:52.939 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:52.939 BaseBdev2' 00:27:52.939 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:52.939 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:52.939 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:53.198 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:53.198 "name": "BaseBdev1", 00:27:53.198 "aliases": [ 00:27:53.198 "44c3df54-d6d5-43ef-872b-fd798707960b" 00:27:53.198 ], 00:27:53.198 "product_name": "Malloc disk", 00:27:53.198 "block_size": 4128, 00:27:53.198 "num_blocks": 8192, 00:27:53.198 "uuid": "44c3df54-d6d5-43ef-872b-fd798707960b", 00:27:53.198 "md_size": 32, 00:27:53.198 "md_interleave": true, 00:27:53.198 "dif_type": 0, 00:27:53.198 "assigned_rate_limits": { 00:27:53.198 "rw_ios_per_sec": 0, 00:27:53.198 "rw_mbytes_per_sec": 0, 00:27:53.198 "r_mbytes_per_sec": 0, 00:27:53.198 "w_mbytes_per_sec": 0 00:27:53.198 }, 00:27:53.198 "claimed": true, 00:27:53.198 "claim_type": "exclusive_write", 00:27:53.198 "zoned": false, 00:27:53.198 "supported_io_types": { 00:27:53.198 "read": true, 00:27:53.198 "write": true, 00:27:53.198 "unmap": true, 00:27:53.198 "flush": true, 00:27:53.198 "reset": true, 00:27:53.198 "nvme_admin": false, 00:27:53.198 "nvme_io": false, 00:27:53.198 "nvme_io_md": false, 00:27:53.198 "write_zeroes": true, 00:27:53.198 "zcopy": true, 00:27:53.198 "get_zone_info": false, 00:27:53.198 "zone_management": false, 00:27:53.198 "zone_append": false, 00:27:53.198 "compare": false, 00:27:53.198 "compare_and_write": false, 00:27:53.198 "abort": true, 00:27:53.198 "seek_hole": false, 00:27:53.198 "seek_data": false, 00:27:53.198 "copy": true, 00:27:53.198 "nvme_iov_md": false 00:27:53.198 }, 00:27:53.198 "memory_domains": [ 00:27:53.198 { 00:27:53.198 "dma_device_id": "system", 00:27:53.198 "dma_device_type": 1 00:27:53.198 }, 00:27:53.198 { 00:27:53.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:53.198 "dma_device_type": 2 00:27:53.198 } 00:27:53.198 ], 00:27:53.198 "driver_specific": {} 00:27:53.198 }' 00:27:53.198 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:53.198 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:53.198 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:53.198 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:53.199 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:53.199 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:53.199 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:53.199 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:53.199 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:53.199 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:53.459 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:53.459 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:53.459 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:53.459 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:53.459 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:53.719 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:53.719 "name": "BaseBdev2", 00:27:53.719 "aliases": [ 00:27:53.719 "05379333-92ad-4a45-9ad6-8dd6758aaa02" 00:27:53.719 ], 00:27:53.719 "product_name": "Malloc disk", 00:27:53.719 "block_size": 4128, 00:27:53.719 "num_blocks": 8192, 00:27:53.719 "uuid": "05379333-92ad-4a45-9ad6-8dd6758aaa02", 00:27:53.719 "md_size": 32, 00:27:53.719 "md_interleave": true, 00:27:53.719 "dif_type": 0, 00:27:53.719 "assigned_rate_limits": { 00:27:53.719 "rw_ios_per_sec": 0, 00:27:53.719 "rw_mbytes_per_sec": 0, 00:27:53.719 "r_mbytes_per_sec": 0, 00:27:53.719 "w_mbytes_per_sec": 0 00:27:53.719 }, 00:27:53.719 "claimed": true, 00:27:53.719 "claim_type": "exclusive_write", 00:27:53.719 "zoned": false, 00:27:53.719 "supported_io_types": { 00:27:53.719 "read": true, 00:27:53.719 "write": true, 00:27:53.719 "unmap": true, 00:27:53.719 "flush": true, 00:27:53.719 "reset": true, 00:27:53.719 "nvme_admin": false, 00:27:53.719 "nvme_io": false, 00:27:53.719 "nvme_io_md": false, 00:27:53.719 "write_zeroes": true, 00:27:53.719 "zcopy": true, 00:27:53.719 "get_zone_info": false, 00:27:53.719 "zone_management": false, 00:27:53.719 "zone_append": false, 00:27:53.719 "compare": false, 00:27:53.719 "compare_and_write": false, 00:27:53.719 "abort": true, 00:27:53.719 "seek_hole": false, 00:27:53.719 "seek_data": false, 00:27:53.719 "copy": true, 00:27:53.719 "nvme_iov_md": false 00:27:53.719 }, 00:27:53.719 "memory_domains": [ 00:27:53.719 { 00:27:53.719 "dma_device_id": "system", 00:27:53.719 "dma_device_type": 1 00:27:53.719 }, 00:27:53.719 { 00:27:53.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:53.719 "dma_device_type": 2 00:27:53.719 } 00:27:53.719 ], 00:27:53.719 "driver_specific": {} 00:27:53.719 }' 00:27:53.719 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:53.719 16:04:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:53.719 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:53.719 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:53.719 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:53.719 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 
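The property checks above boil down to querying each base bdev over the test's private RPC socket and comparing a handful of jq-extracted fields against the values expected for interleaved-metadata volumes (4128-byte blocks carrying 4096 bytes of data plus 32 bytes of metadata). A minimal sketch of the same checks, assuming an SPDK checkout as the working directory and a bdev named BaseBdev1 already registered on /var/tmp/spdk-raid.sock:

  sock=/var/tmp/spdk-raid.sock
  rpc="./scripts/rpc.py -s $sock"                    # assumption: run from an SPDK tree
  info=$($rpc bdev_get_bdevs -b BaseBdev1 | jq '.[]')
  [[ $(jq .block_size    <<< "$info") -eq 4128 ]]    # 4096 B data + 32 B interleaved md
  [[ $(jq .md_size       <<< "$info") -eq 32   ]]
  [[ $(jq .md_interleave <<< "$info") == true  ]]
  [[ $(jq .dif_type      <<< "$info") -eq 0    ]]
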
00:27:53.719 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:53.719 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:53.979 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:53.979 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:53.979 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:53.979 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:53.979 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:54.238 [2024-07-12 16:04:14.454531] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.239 "name": "Existed_Raid", 00:27:54.239 "uuid": 
"dc1b475a-d8cb-447c-91b2-732a0c95bf48", 00:27:54.239 "strip_size_kb": 0, 00:27:54.239 "state": "online", 00:27:54.239 "raid_level": "raid1", 00:27:54.239 "superblock": true, 00:27:54.239 "num_base_bdevs": 2, 00:27:54.239 "num_base_bdevs_discovered": 1, 00:27:54.239 "num_base_bdevs_operational": 1, 00:27:54.239 "base_bdevs_list": [ 00:27:54.239 { 00:27:54.239 "name": null, 00:27:54.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.239 "is_configured": false, 00:27:54.239 "data_offset": 256, 00:27:54.239 "data_size": 7936 00:27:54.239 }, 00:27:54.239 { 00:27:54.239 "name": "BaseBdev2", 00:27:54.239 "uuid": "05379333-92ad-4a45-9ad6-8dd6758aaa02", 00:27:54.239 "is_configured": true, 00:27:54.239 "data_offset": 256, 00:27:54.239 "data_size": 7936 00:27:54.239 } 00:27:54.239 ] 00:27:54.239 }' 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.239 16:04:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:54.807 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:54.807 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:54.807 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.807 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:55.065 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:55.065 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:55.065 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:55.324 [2024-07-12 16:04:15.545305] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:55.324 [2024-07-12 16:04:15.545366] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:55.324 [2024-07-12 16:04:15.551678] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:55.324 [2024-07-12 16:04:15.551704] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:55.324 [2024-07-12 16:04:15.551716] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x158e000 name Existed_Raid, state offline 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:55.324 16:04:15 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 2677568 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2677568 ']' 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2677568 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:55.324 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2677568 00:27:55.583 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:55.583 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:55.583 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2677568' 00:27:55.583 killing process with pid 2677568 00:27:55.583 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2677568 00:27:55.583 [2024-07-12 16:04:15.792876] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:55.583 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2677568 00:27:55.583 [2024-07-12 16:04:15.793463] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:55.583 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:27:55.583 00:27:55.583 real 0m8.891s 00:27:55.583 user 0m16.157s 00:27:55.583 sys 0m1.345s 00:27:55.583 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:55.583 16:04:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:55.583 ************************************ 00:27:55.583 END TEST raid_state_function_test_sb_md_interleaved 00:27:55.583 ************************************ 00:27:55.583 16:04:15 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:55.583 16:04:15 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:27:55.583 16:04:15 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:27:55.583 16:04:15 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:55.583 16:04:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:55.583 ************************************ 00:27:55.583 START TEST raid_superblock_test_md_interleaved 00:27:55.583 ************************************ 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local 
num_base_bdevs=2 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=2679157 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 2679157 /var/tmp/spdk-raid.sock 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2679157 ']' 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:55.583 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:55.583 16:04:15 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:55.843 [2024-07-12 16:04:16.048516] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
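Before any RAID RPCs are issued, the superblock test stands up its own target: a bare bdev_svc application started with the bdev_raid debug log flag and a dedicated RPC socket, which every later rpc.py call addresses via -s. A rough equivalent of that startup, with the SPDK path left as a placeholder:

  SPDK=/path/to/spdk                                 # placeholder for the build tree
  sock=/var/tmp/spdk-raid.sock
  "$SPDK"/test/app/bdev_svc/bdev_svc -r "$sock" -L bdev_raid &
  svc_pid=$!
  # the harness waits via waitforlisten; polling the socket until an RPC
  # answers is one simple stand-in for that
  until "$SPDK"/scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1; do
          sleep 0.1
  done
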
00:27:55.843 [2024-07-12 16:04:16.048563] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2679157 ] 00:27:55.843 [2024-07-12 16:04:16.136127] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:55.843 [2024-07-12 16:04:16.199548] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:55.843 [2024-07-12 16:04:16.242932] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:55.843 [2024-07-12 16:04:16.242956] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:56.781 16:04:16 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:27:56.781 malloc1 00:27:56.781 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:57.041 [2024-07-12 16:04:17.249479] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:57.041 [2024-07-12 16:04:17.249526] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:57.041 [2024-07-12 16:04:17.249543] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b794c0 00:27:57.041 [2024-07-12 16:04:17.249551] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:57.041 [2024-07-12 16:04:17.250887] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:57.041 [2024-07-12 16:04:17.250919] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:57.041 pt1 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:27:57.041 malloc2 00:27:57.041 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:57.301 [2024-07-12 16:04:17.663683] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:57.301 [2024-07-12 16:04:17.663726] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:57.301 [2024-07-12 16:04:17.663738] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d06d70 00:27:57.301 [2024-07-12 16:04:17.663745] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:57.301 [2024-07-12 16:04:17.665024] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:57.301 [2024-07-12 16:04:17.665051] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:57.301 pt2 00:27:57.301 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:57.301 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:57.301 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:27:57.562 [2024-07-12 16:04:17.868228] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:57.562 [2024-07-12 16:04:17.869421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:57.562 [2024-07-12 16:04:17.869563] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cf9df0 00:27:57.562 [2024-07-12 16:04:17.869573] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:57.562 [2024-07-12 16:04:17.869631] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b77530 00:27:57.562 [2024-07-12 16:04:17.869702] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cf9df0 00:27:57.562 [2024-07-12 16:04:17.869708] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cf9df0 00:27:57.562 [2024-07-12 16:04:17.869763] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.562 16:04:17 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.822 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:57.822 "name": "raid_bdev1", 00:27:57.822 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:27:57.822 "strip_size_kb": 0, 00:27:57.822 "state": "online", 00:27:57.822 "raid_level": "raid1", 00:27:57.822 "superblock": true, 00:27:57.822 "num_base_bdevs": 2, 00:27:57.822 "num_base_bdevs_discovered": 2, 00:27:57.822 "num_base_bdevs_operational": 2, 00:27:57.822 "base_bdevs_list": [ 00:27:57.822 { 00:27:57.822 "name": "pt1", 00:27:57.822 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:57.822 "is_configured": true, 00:27:57.822 "data_offset": 256, 00:27:57.822 "data_size": 7936 00:27:57.822 }, 00:27:57.822 { 00:27:57.822 "name": "pt2", 00:27:57.822 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:57.822 "is_configured": true, 00:27:57.822 "data_offset": 256, 00:27:57.822 "data_size": 7936 00:27:57.822 } 00:27:57.822 ] 00:27:57.822 }' 00:27:57.822 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:57.822 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:58.391 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:58.391 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:58.391 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:58.391 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:58.391 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:58.391 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:27:58.391 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 
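The trace above builds the array under test in four layers: two malloc bdevs with 4096-byte blocks and 32 bytes of interleaved metadata, a passthru bdev on top of each (pt1/pt2 with fixed UUIDs), and finally a raid1 bdev assembled from the passthrus, where -s asks for an on-disk superblock, the feature this test exercises. Condensed into the bare RPC sequence, reusing the $rpc/$sock shorthand from the earlier sketch:

  $rpc bdev_malloc_create 32 4096 -m 32 -i -b malloc1   # 32 MiB => 8192 blocks of 4096+32 B
  $rpc bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
  $rpc bdev_malloc_create 32 4096 -m 32 -i -b malloc2
  $rpc bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
  $rpc bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s
  $rpc bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1").state'   # expect "online"
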
00:27:58.391 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:58.391 [2024-07-12 16:04:18.830828] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:58.651 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:58.651 "name": "raid_bdev1", 00:27:58.651 "aliases": [ 00:27:58.651 "ce171729-e80c-4d2d-88ee-f7edd5fa0d58" 00:27:58.651 ], 00:27:58.651 "product_name": "Raid Volume", 00:27:58.651 "block_size": 4128, 00:27:58.651 "num_blocks": 7936, 00:27:58.651 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:27:58.651 "md_size": 32, 00:27:58.651 "md_interleave": true, 00:27:58.651 "dif_type": 0, 00:27:58.651 "assigned_rate_limits": { 00:27:58.651 "rw_ios_per_sec": 0, 00:27:58.651 "rw_mbytes_per_sec": 0, 00:27:58.651 "r_mbytes_per_sec": 0, 00:27:58.651 "w_mbytes_per_sec": 0 00:27:58.651 }, 00:27:58.651 "claimed": false, 00:27:58.651 "zoned": false, 00:27:58.651 "supported_io_types": { 00:27:58.651 "read": true, 00:27:58.651 "write": true, 00:27:58.651 "unmap": false, 00:27:58.651 "flush": false, 00:27:58.651 "reset": true, 00:27:58.651 "nvme_admin": false, 00:27:58.651 "nvme_io": false, 00:27:58.651 "nvme_io_md": false, 00:27:58.651 "write_zeroes": true, 00:27:58.651 "zcopy": false, 00:27:58.651 "get_zone_info": false, 00:27:58.651 "zone_management": false, 00:27:58.651 "zone_append": false, 00:27:58.651 "compare": false, 00:27:58.651 "compare_and_write": false, 00:27:58.651 "abort": false, 00:27:58.651 "seek_hole": false, 00:27:58.651 "seek_data": false, 00:27:58.651 "copy": false, 00:27:58.651 "nvme_iov_md": false 00:27:58.651 }, 00:27:58.651 "memory_domains": [ 00:27:58.651 { 00:27:58.651 "dma_device_id": "system", 00:27:58.651 "dma_device_type": 1 00:27:58.651 }, 00:27:58.651 { 00:27:58.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:58.651 "dma_device_type": 2 00:27:58.651 }, 00:27:58.651 { 00:27:58.651 "dma_device_id": "system", 00:27:58.651 "dma_device_type": 1 00:27:58.651 }, 00:27:58.651 { 00:27:58.651 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:58.651 "dma_device_type": 2 00:27:58.651 } 00:27:58.651 ], 00:27:58.651 "driver_specific": { 00:27:58.651 "raid": { 00:27:58.651 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:27:58.651 "strip_size_kb": 0, 00:27:58.651 "state": "online", 00:27:58.651 "raid_level": "raid1", 00:27:58.651 "superblock": true, 00:27:58.651 "num_base_bdevs": 2, 00:27:58.651 "num_base_bdevs_discovered": 2, 00:27:58.651 "num_base_bdevs_operational": 2, 00:27:58.651 "base_bdevs_list": [ 00:27:58.651 { 00:27:58.651 "name": "pt1", 00:27:58.651 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:58.651 "is_configured": true, 00:27:58.651 "data_offset": 256, 00:27:58.651 "data_size": 7936 00:27:58.651 }, 00:27:58.651 { 00:27:58.651 "name": "pt2", 00:27:58.651 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:58.651 "is_configured": true, 00:27:58.651 "data_offset": 256, 00:27:58.651 "data_size": 7936 00:27:58.651 } 00:27:58.651 ] 00:27:58.651 } 00:27:58.651 } 00:27:58.651 }' 00:27:58.651 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:58.651 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:58.651 pt2' 00:27:58.651 16:04:18 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:58.651 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:58.651 16:04:18 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:58.912 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:58.912 "name": "pt1", 00:27:58.912 "aliases": [ 00:27:58.912 "00000000-0000-0000-0000-000000000001" 00:27:58.912 ], 00:27:58.912 "product_name": "passthru", 00:27:58.912 "block_size": 4128, 00:27:58.913 "num_blocks": 8192, 00:27:58.913 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:58.913 "md_size": 32, 00:27:58.913 "md_interleave": true, 00:27:58.913 "dif_type": 0, 00:27:58.913 "assigned_rate_limits": { 00:27:58.913 "rw_ios_per_sec": 0, 00:27:58.913 "rw_mbytes_per_sec": 0, 00:27:58.913 "r_mbytes_per_sec": 0, 00:27:58.913 "w_mbytes_per_sec": 0 00:27:58.913 }, 00:27:58.913 "claimed": true, 00:27:58.913 "claim_type": "exclusive_write", 00:27:58.913 "zoned": false, 00:27:58.913 "supported_io_types": { 00:27:58.913 "read": true, 00:27:58.913 "write": true, 00:27:58.913 "unmap": true, 00:27:58.913 "flush": true, 00:27:58.913 "reset": true, 00:27:58.913 "nvme_admin": false, 00:27:58.913 "nvme_io": false, 00:27:58.913 "nvme_io_md": false, 00:27:58.913 "write_zeroes": true, 00:27:58.913 "zcopy": true, 00:27:58.913 "get_zone_info": false, 00:27:58.913 "zone_management": false, 00:27:58.913 "zone_append": false, 00:27:58.913 "compare": false, 00:27:58.913 "compare_and_write": false, 00:27:58.913 "abort": true, 00:27:58.913 "seek_hole": false, 00:27:58.913 "seek_data": false, 00:27:58.913 "copy": true, 00:27:58.913 "nvme_iov_md": false 00:27:58.913 }, 00:27:58.913 "memory_domains": [ 00:27:58.913 { 00:27:58.913 "dma_device_id": "system", 00:27:58.913 "dma_device_type": 1 00:27:58.913 }, 00:27:58.913 { 00:27:58.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:58.913 "dma_device_type": 2 00:27:58.913 } 00:27:58.913 ], 00:27:58.913 "driver_specific": { 00:27:58.913 "passthru": { 00:27:58.913 "name": "pt1", 00:27:58.913 "base_bdev_name": "malloc1" 00:27:58.913 } 00:27:58.913 } 00:27:58.913 }' 00:27:58.913 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:58.913 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:58.913 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:58.913 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:58.913 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:58.913 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:58.913 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:58.913 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:59.173 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:59.173 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:59.173 16:04:19 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:59.173 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:59.173 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:59.173 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:59.173 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:59.433 "name": "pt2", 00:27:59.433 "aliases": [ 00:27:59.433 "00000000-0000-0000-0000-000000000002" 00:27:59.433 ], 00:27:59.433 "product_name": "passthru", 00:27:59.433 "block_size": 4128, 00:27:59.433 "num_blocks": 8192, 00:27:59.433 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:59.433 "md_size": 32, 00:27:59.433 "md_interleave": true, 00:27:59.433 "dif_type": 0, 00:27:59.433 "assigned_rate_limits": { 00:27:59.433 "rw_ios_per_sec": 0, 00:27:59.433 "rw_mbytes_per_sec": 0, 00:27:59.433 "r_mbytes_per_sec": 0, 00:27:59.433 "w_mbytes_per_sec": 0 00:27:59.433 }, 00:27:59.433 "claimed": true, 00:27:59.433 "claim_type": "exclusive_write", 00:27:59.433 "zoned": false, 00:27:59.433 "supported_io_types": { 00:27:59.433 "read": true, 00:27:59.433 "write": true, 00:27:59.433 "unmap": true, 00:27:59.433 "flush": true, 00:27:59.433 "reset": true, 00:27:59.433 "nvme_admin": false, 00:27:59.433 "nvme_io": false, 00:27:59.433 "nvme_io_md": false, 00:27:59.433 "write_zeroes": true, 00:27:59.433 "zcopy": true, 00:27:59.433 "get_zone_info": false, 00:27:59.433 "zone_management": false, 00:27:59.433 "zone_append": false, 00:27:59.433 "compare": false, 00:27:59.433 "compare_and_write": false, 00:27:59.433 "abort": true, 00:27:59.433 "seek_hole": false, 00:27:59.433 "seek_data": false, 00:27:59.433 "copy": true, 00:27:59.433 "nvme_iov_md": false 00:27:59.433 }, 00:27:59.433 "memory_domains": [ 00:27:59.433 { 00:27:59.433 "dma_device_id": "system", 00:27:59.433 "dma_device_type": 1 00:27:59.433 }, 00:27:59.433 { 00:27:59.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:59.433 "dma_device_type": 2 00:27:59.433 } 00:27:59.433 ], 00:27:59.433 "driver_specific": { 00:27:59.433 "passthru": { 00:27:59.433 "name": "pt2", 00:27:59.433 "base_bdev_name": "malloc2" 00:27:59.433 } 00:27:59.433 } 00:27:59.433 }' 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:59.433 16:04:19 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:59.433 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:59.693 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:59.693 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:59.693 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:59.693 16:04:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:59.693 [2024-07-12 16:04:20.106072] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:59.693 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=ce171729-e80c-4d2d-88ee-f7edd5fa0d58 00:27:59.693 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z ce171729-e80c-4d2d-88ee-f7edd5fa0d58 ']' 00:27:59.693 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:59.952 [2024-07-12 16:04:20.298330] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:59.952 [2024-07-12 16:04:20.298346] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:59.952 [2024-07-12 16:04:20.298385] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:59.952 [2024-07-12 16:04:20.298426] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:59.952 [2024-07-12 16:04:20.298432] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf9df0 name raid_bdev1, state offline 00:27:59.952 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.952 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:00.212 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:00.212 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:00.212 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:00.212 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:00.472 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:00.472 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:00.472 16:04:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:00.472 16:04:20 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:00.731 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:00.731 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:00.732 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:00.992 [2024-07-12 16:04:21.280777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:00.992 [2024-07-12 16:04:21.281844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:00.992 [2024-07-12 16:04:21.281887] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:00.992 [2024-07-12 16:04:21.281913] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:00.992 [2024-07-12 16:04:21.281924] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:00.992 [2024-07-12 16:04:21.281929] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b79960 name raid_bdev1, state configuring 00:28:00.992 request: 00:28:00.992 { 00:28:00.992 "name": "raid_bdev1", 00:28:00.992 "raid_level": "raid1", 00:28:00.992 "base_bdevs": [ 00:28:00.992 "malloc1", 00:28:00.992 "malloc2" 00:28:00.992 ], 00:28:00.992 "superblock": false, 00:28:00.992 "method": 
"bdev_raid_create", 00:28:00.992 "req_id": 1 00:28:00.992 } 00:28:00.992 Got JSON-RPC error response 00:28:00.992 response: 00:28:00.992 { 00:28:00.992 "code": -17, 00:28:00.992 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:00.992 } 00:28:00.992 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:28:00.992 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:00.992 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:00.992 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:00.992 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.992 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:01.252 [2024-07-12 16:04:21.665714] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:01.252 [2024-07-12 16:04:21.665736] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:01.252 [2024-07-12 16:04:21.665747] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b796f0 00:28:01.252 [2024-07-12 16:04:21.665752] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:01.252 [2024-07-12 16:04:21.666834] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:01.252 [2024-07-12 16:04:21.666851] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:01.252 [2024-07-12 16:04:21.666880] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:01.252 [2024-07-12 16:04:21.666897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:01.252 pt1 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:01.252 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:01.512 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:01.512 "name": "raid_bdev1", 00:28:01.512 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:28:01.512 "strip_size_kb": 0, 00:28:01.512 "state": "configuring", 00:28:01.512 "raid_level": "raid1", 00:28:01.512 "superblock": true, 00:28:01.512 "num_base_bdevs": 2, 00:28:01.512 "num_base_bdevs_discovered": 1, 00:28:01.512 "num_base_bdevs_operational": 2, 00:28:01.512 "base_bdevs_list": [ 00:28:01.512 { 00:28:01.512 "name": "pt1", 00:28:01.512 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:01.512 "is_configured": true, 00:28:01.512 "data_offset": 256, 00:28:01.512 "data_size": 7936 00:28:01.512 }, 00:28:01.512 { 00:28:01.512 "name": null, 00:28:01.512 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:01.512 "is_configured": false, 00:28:01.512 "data_offset": 256, 00:28:01.512 "data_size": 7936 00:28:01.512 } 00:28:01.512 ] 00:28:01.512 }' 00:28:01.512 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:01.512 16:04:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:02.081 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:02.081 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:02.081 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:02.081 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:02.342 [2024-07-12 16:04:22.608097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:02.342 [2024-07-12 16:04:22.608122] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:02.342 [2024-07-12 16:04:22.608133] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cfa860 00:28:02.342 [2024-07-12 16:04:22.608139] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:02.342 [2024-07-12 16:04:22.608241] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:02.342 [2024-07-12 16:04:22.608249] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:02.342 [2024-07-12 16:04:22.608274] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:02.342 [2024-07-12 16:04:22.608284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:02.342 [2024-07-12 16:04:22.608344] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b77c30 00:28:02.342 [2024-07-12 16:04:22.608350] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:02.342 [2024-07-12 16:04:22.608391] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cfa210 00:28:02.342 [2024-07-12 16:04:22.608448] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b77c30 00:28:02.342 [2024-07-12 16:04:22.608453] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b77c30 00:28:02.342 [2024-07-12 16:04:22.608496] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:02.342 pt2 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.342 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.601 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:02.601 "name": "raid_bdev1", 00:28:02.601 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:28:02.601 "strip_size_kb": 0, 00:28:02.601 "state": "online", 00:28:02.601 "raid_level": "raid1", 00:28:02.601 "superblock": true, 00:28:02.601 "num_base_bdevs": 2, 00:28:02.601 "num_base_bdevs_discovered": 2, 00:28:02.601 "num_base_bdevs_operational": 2, 00:28:02.601 "base_bdevs_list": [ 00:28:02.601 { 00:28:02.601 "name": "pt1", 00:28:02.601 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:02.601 "is_configured": true, 00:28:02.601 "data_offset": 256, 00:28:02.601 "data_size": 7936 00:28:02.601 }, 00:28:02.601 { 00:28:02.601 "name": "pt2", 00:28:02.601 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:02.601 "is_configured": true, 00:28:02.601 "data_offset": 256, 00:28:02.601 "data_size": 7936 00:28:02.601 } 00:28:02.601 ] 00:28:02.601 }' 00:28:02.601 16:04:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:02.601 16:04:22 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:03.170 [2024-07-12 16:04:23.582773] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:03.170 "name": "raid_bdev1", 00:28:03.170 "aliases": [ 00:28:03.170 "ce171729-e80c-4d2d-88ee-f7edd5fa0d58" 00:28:03.170 ], 00:28:03.170 "product_name": "Raid Volume", 00:28:03.170 "block_size": 4128, 00:28:03.170 "num_blocks": 7936, 00:28:03.170 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:28:03.170 "md_size": 32, 00:28:03.170 "md_interleave": true, 00:28:03.170 "dif_type": 0, 00:28:03.170 "assigned_rate_limits": { 00:28:03.170 "rw_ios_per_sec": 0, 00:28:03.170 "rw_mbytes_per_sec": 0, 00:28:03.170 "r_mbytes_per_sec": 0, 00:28:03.170 "w_mbytes_per_sec": 0 00:28:03.170 }, 00:28:03.170 "claimed": false, 00:28:03.170 "zoned": false, 00:28:03.170 "supported_io_types": { 00:28:03.170 "read": true, 00:28:03.170 "write": true, 00:28:03.170 "unmap": false, 00:28:03.170 "flush": false, 00:28:03.170 "reset": true, 00:28:03.170 "nvme_admin": false, 00:28:03.170 "nvme_io": false, 00:28:03.170 "nvme_io_md": false, 00:28:03.170 "write_zeroes": true, 00:28:03.170 "zcopy": false, 00:28:03.170 "get_zone_info": false, 00:28:03.170 "zone_management": false, 00:28:03.170 "zone_append": false, 00:28:03.170 "compare": false, 00:28:03.170 "compare_and_write": false, 00:28:03.170 "abort": false, 00:28:03.170 "seek_hole": false, 00:28:03.170 "seek_data": false, 00:28:03.170 "copy": false, 00:28:03.170 "nvme_iov_md": false 00:28:03.170 }, 00:28:03.170 "memory_domains": [ 00:28:03.170 { 00:28:03.170 "dma_device_id": "system", 00:28:03.170 "dma_device_type": 1 00:28:03.170 }, 00:28:03.170 { 00:28:03.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.170 "dma_device_type": 2 00:28:03.170 }, 00:28:03.170 { 00:28:03.170 "dma_device_id": "system", 00:28:03.170 "dma_device_type": 1 00:28:03.170 }, 00:28:03.170 { 00:28:03.170 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.170 "dma_device_type": 2 00:28:03.170 } 00:28:03.170 ], 00:28:03.170 "driver_specific": { 00:28:03.170 "raid": { 00:28:03.170 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:28:03.170 "strip_size_kb": 0, 00:28:03.170 "state": "online", 00:28:03.170 "raid_level": "raid1", 00:28:03.170 "superblock": true, 00:28:03.170 "num_base_bdevs": 2, 00:28:03.170 
"num_base_bdevs_discovered": 2, 00:28:03.170 "num_base_bdevs_operational": 2, 00:28:03.170 "base_bdevs_list": [ 00:28:03.170 { 00:28:03.170 "name": "pt1", 00:28:03.170 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:03.170 "is_configured": true, 00:28:03.170 "data_offset": 256, 00:28:03.170 "data_size": 7936 00:28:03.170 }, 00:28:03.170 { 00:28:03.170 "name": "pt2", 00:28:03.170 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:03.170 "is_configured": true, 00:28:03.170 "data_offset": 256, 00:28:03.170 "data_size": 7936 00:28:03.170 } 00:28:03.170 ] 00:28:03.170 } 00:28:03.170 } 00:28:03.170 }' 00:28:03.170 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:03.431 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:03.431 pt2' 00:28:03.431 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:03.431 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:03.431 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:03.431 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:03.431 "name": "pt1", 00:28:03.431 "aliases": [ 00:28:03.431 "00000000-0000-0000-0000-000000000001" 00:28:03.431 ], 00:28:03.431 "product_name": "passthru", 00:28:03.431 "block_size": 4128, 00:28:03.431 "num_blocks": 8192, 00:28:03.431 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:03.431 "md_size": 32, 00:28:03.431 "md_interleave": true, 00:28:03.431 "dif_type": 0, 00:28:03.431 "assigned_rate_limits": { 00:28:03.431 "rw_ios_per_sec": 0, 00:28:03.431 "rw_mbytes_per_sec": 0, 00:28:03.431 "r_mbytes_per_sec": 0, 00:28:03.431 "w_mbytes_per_sec": 0 00:28:03.431 }, 00:28:03.431 "claimed": true, 00:28:03.431 "claim_type": "exclusive_write", 00:28:03.431 "zoned": false, 00:28:03.431 "supported_io_types": { 00:28:03.431 "read": true, 00:28:03.431 "write": true, 00:28:03.432 "unmap": true, 00:28:03.432 "flush": true, 00:28:03.432 "reset": true, 00:28:03.432 "nvme_admin": false, 00:28:03.432 "nvme_io": false, 00:28:03.432 "nvme_io_md": false, 00:28:03.432 "write_zeroes": true, 00:28:03.432 "zcopy": true, 00:28:03.432 "get_zone_info": false, 00:28:03.432 "zone_management": false, 00:28:03.432 "zone_append": false, 00:28:03.432 "compare": false, 00:28:03.432 "compare_and_write": false, 00:28:03.432 "abort": true, 00:28:03.432 "seek_hole": false, 00:28:03.432 "seek_data": false, 00:28:03.432 "copy": true, 00:28:03.432 "nvme_iov_md": false 00:28:03.432 }, 00:28:03.432 "memory_domains": [ 00:28:03.432 { 00:28:03.432 "dma_device_id": "system", 00:28:03.432 "dma_device_type": 1 00:28:03.432 }, 00:28:03.432 { 00:28:03.432 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.432 "dma_device_type": 2 00:28:03.432 } 00:28:03.432 ], 00:28:03.432 "driver_specific": { 00:28:03.432 "passthru": { 00:28:03.432 "name": "pt1", 00:28:03.432 "base_bdev_name": "malloc1" 00:28:03.432 } 00:28:03.432 } 00:28:03.432 }' 00:28:03.432 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:03.432 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:28:03.693 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:03.693 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:03.693 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:03.693 16:04:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:03.693 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:03.693 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:03.693 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:03.693 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:03.693 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:03.953 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:03.953 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:03.953 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:03.953 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:03.953 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:03.953 "name": "pt2", 00:28:03.953 "aliases": [ 00:28:03.953 "00000000-0000-0000-0000-000000000002" 00:28:03.953 ], 00:28:03.953 "product_name": "passthru", 00:28:03.953 "block_size": 4128, 00:28:03.953 "num_blocks": 8192, 00:28:03.953 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:03.953 "md_size": 32, 00:28:03.953 "md_interleave": true, 00:28:03.953 "dif_type": 0, 00:28:03.953 "assigned_rate_limits": { 00:28:03.953 "rw_ios_per_sec": 0, 00:28:03.953 "rw_mbytes_per_sec": 0, 00:28:03.953 "r_mbytes_per_sec": 0, 00:28:03.953 "w_mbytes_per_sec": 0 00:28:03.953 }, 00:28:03.953 "claimed": true, 00:28:03.953 "claim_type": "exclusive_write", 00:28:03.953 "zoned": false, 00:28:03.953 "supported_io_types": { 00:28:03.953 "read": true, 00:28:03.953 "write": true, 00:28:03.953 "unmap": true, 00:28:03.953 "flush": true, 00:28:03.953 "reset": true, 00:28:03.953 "nvme_admin": false, 00:28:03.953 "nvme_io": false, 00:28:03.953 "nvme_io_md": false, 00:28:03.953 "write_zeroes": true, 00:28:03.953 "zcopy": true, 00:28:03.953 "get_zone_info": false, 00:28:03.953 "zone_management": false, 00:28:03.953 "zone_append": false, 00:28:03.953 "compare": false, 00:28:03.953 "compare_and_write": false, 00:28:03.953 "abort": true, 00:28:03.953 "seek_hole": false, 00:28:03.953 "seek_data": false, 00:28:03.953 "copy": true, 00:28:03.953 "nvme_iov_md": false 00:28:03.953 }, 00:28:03.953 "memory_domains": [ 00:28:03.953 { 00:28:03.953 "dma_device_id": "system", 00:28:03.953 "dma_device_type": 1 00:28:03.953 }, 00:28:03.953 { 00:28:03.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:03.953 "dma_device_type": 2 00:28:03.953 } 00:28:03.953 ], 00:28:03.953 "driver_specific": { 00:28:03.953 "passthru": { 00:28:03.953 "name": "pt2", 00:28:03.953 "base_bdev_name": "malloc2" 00:28:03.953 } 00:28:03.953 } 00:28:03.953 }' 00:28:03.953 16:04:24 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.214 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:04.214 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:28:04.214 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.214 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:04.214 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:28:04.214 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.214 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:04.214 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:28:04.214 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.475 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:04.475 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:28:04.475 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:04.475 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:04.475 [2024-07-12 16:04:24.914144] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:04.735 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' ce171729-e80c-4d2d-88ee-f7edd5fa0d58 '!=' ce171729-e80c-4d2d-88ee-f7edd5fa0d58 ']' 00:28:04.735 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:04.735 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:04.735 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:28:04.735 16:04:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:04.735 [2024-07-12 16:04:25.086395] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:04.735 16:04:25 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.735 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:04.995 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:04.995 "name": "raid_bdev1", 00:28:04.995 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:28:04.995 "strip_size_kb": 0, 00:28:04.995 "state": "online", 00:28:04.995 "raid_level": "raid1", 00:28:04.995 "superblock": true, 00:28:04.995 "num_base_bdevs": 2, 00:28:04.995 "num_base_bdevs_discovered": 1, 00:28:04.995 "num_base_bdevs_operational": 1, 00:28:04.995 "base_bdevs_list": [ 00:28:04.995 { 00:28:04.995 "name": null, 00:28:04.995 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:04.995 "is_configured": false, 00:28:04.995 "data_offset": 256, 00:28:04.995 "data_size": 7936 00:28:04.995 }, 00:28:04.995 { 00:28:04.995 "name": "pt2", 00:28:04.995 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:04.995 "is_configured": true, 00:28:04.995 "data_offset": 256, 00:28:04.995 "data_size": 7936 00:28:04.995 } 00:28:04.995 ] 00:28:04.995 }' 00:28:04.995 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:04.995 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:05.565 16:04:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:05.825 [2024-07-12 16:04:26.060839] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:05.825 [2024-07-12 16:04:26.060853] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:05.825 [2024-07-12 16:04:26.060880] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:05.825 [2024-07-12 16:04:26.060908] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:05.825 [2024-07-12 16:04:26.060914] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b77c30 name raid_bdev1, state offline 00:28:05.825 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.825 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:05.825 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:05.825 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:05.825 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:05.825 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 
00:28:05.825 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:06.085 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:06.085 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:06.085 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:06.085 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:06.085 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:28:06.085 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:06.345 [2024-07-12 16:04:26.638274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:06.345 [2024-07-12 16:04:26.638295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:06.345 [2024-07-12 16:04:26.638306] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cfb9b0 00:28:06.345 [2024-07-12 16:04:26.638312] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:06.345 [2024-07-12 16:04:26.639421] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:06.345 [2024-07-12 16:04:26.639437] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:06.345 [2024-07-12 16:04:26.639468] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:06.345 [2024-07-12 16:04:26.639486] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:06.345 [2024-07-12 16:04:26.639534] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cfc110 00:28:06.345 [2024-07-12 16:04:26.639539] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:06.345 [2024-07-12 16:04:26.639579] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cfcc70 00:28:06.345 [2024-07-12 16:04:26.639636] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cfc110 00:28:06.345 [2024-07-12 16:04:26.639641] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cfc110 00:28:06.345 [2024-07-12 16:04:26.639678] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:06.345 pt2 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.345 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:06.605 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:06.605 "name": "raid_bdev1", 00:28:06.605 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:28:06.605 "strip_size_kb": 0, 00:28:06.605 "state": "online", 00:28:06.605 "raid_level": "raid1", 00:28:06.605 "superblock": true, 00:28:06.605 "num_base_bdevs": 2, 00:28:06.605 "num_base_bdevs_discovered": 1, 00:28:06.605 "num_base_bdevs_operational": 1, 00:28:06.605 "base_bdevs_list": [ 00:28:06.605 { 00:28:06.605 "name": null, 00:28:06.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:06.605 "is_configured": false, 00:28:06.605 "data_offset": 256, 00:28:06.605 "data_size": 7936 00:28:06.605 }, 00:28:06.605 { 00:28:06.605 "name": "pt2", 00:28:06.605 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:06.605 "is_configured": true, 00:28:06.605 "data_offset": 256, 00:28:06.605 "data_size": 7936 00:28:06.605 } 00:28:06.605 ] 00:28:06.605 }' 00:28:06.605 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:06.605 16:04:26 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:07.176 16:04:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:07.176 [2024-07-12 16:04:27.564592] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:07.176 [2024-07-12 16:04:27.564603] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:07.176 [2024-07-12 16:04:27.564634] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:07.176 [2024-07-12 16:04:27.564663] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:07.176 [2024-07-12 16:04:27.564668] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfc110 name raid_bdev1, state offline 00:28:07.176 16:04:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.176 16:04:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:07.436 16:04:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:07.436 16:04:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:07.436 16:04:27 bdev_raid.raid_superblock_test_md_interleaved 
-- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:07.436 16:04:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:07.697 [2024-07-12 16:04:27.981631] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:07.697 [2024-07-12 16:04:27.981653] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:07.697 [2024-07-12 16:04:27.981662] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b776a0 00:28:07.697 [2024-07-12 16:04:27.981668] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:07.697 [2024-07-12 16:04:27.982766] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:07.697 [2024-07-12 16:04:27.982782] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:07.697 [2024-07-12 16:04:27.982811] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:07.697 [2024-07-12 16:04:27.982827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:07.697 [2024-07-12 16:04:27.982885] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:07.697 [2024-07-12 16:04:27.982892] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:07.697 [2024-07-12 16:04:27.982900] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfcf30 name raid_bdev1, state configuring 00:28:07.697 [2024-07-12 16:04:27.982913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:07.697 [2024-07-12 16:04:27.982952] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cfd1b0 00:28:07.697 [2024-07-12 16:04:27.982957] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:07.697 [2024-07-12 16:04:27.982995] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b6ed70 00:28:07.697 [2024-07-12 16:04:27.983050] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cfd1b0 00:28:07.697 [2024-07-12 16:04:27.983055] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cfd1b0 00:28:07.697 [2024-07-12 16:04:27.983098] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:07.697 pt1 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:07.697 16:04:28 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.697 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.957 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:07.958 "name": "raid_bdev1", 00:28:07.958 "uuid": "ce171729-e80c-4d2d-88ee-f7edd5fa0d58", 00:28:07.958 "strip_size_kb": 0, 00:28:07.958 "state": "online", 00:28:07.958 "raid_level": "raid1", 00:28:07.958 "superblock": true, 00:28:07.958 "num_base_bdevs": 2, 00:28:07.958 "num_base_bdevs_discovered": 1, 00:28:07.958 "num_base_bdevs_operational": 1, 00:28:07.958 "base_bdevs_list": [ 00:28:07.958 { 00:28:07.958 "name": null, 00:28:07.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.958 "is_configured": false, 00:28:07.958 "data_offset": 256, 00:28:07.958 "data_size": 7936 00:28:07.958 }, 00:28:07.958 { 00:28:07.958 "name": "pt2", 00:28:07.958 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:07.958 "is_configured": true, 00:28:07.958 "data_offset": 256, 00:28:07.958 "data_size": 7936 00:28:07.958 } 00:28:07.958 ] 00:28:07.958 }' 00:28:07.958 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:07.958 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:08.563 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:08.563 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:08.563 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:08.563 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:08.563 16:04:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:08.824 [2024-07-12 16:04:29.148779] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' ce171729-e80c-4d2d-88ee-f7edd5fa0d58 '!=' ce171729-e80c-4d2d-88ee-f7edd5fa0d58 ']' 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 2679157 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2679157 ']' 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2679157 00:28:08.824 16:04:29 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2679157 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2679157' 00:28:08.824 killing process with pid 2679157 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 2679157 00:28:08.824 [2024-07-12 16:04:29.234688] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:08.824 [2024-07-12 16:04:29.234727] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:08.824 [2024-07-12 16:04:29.234758] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:08.824 [2024-07-12 16:04:29.234763] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cfd1b0 name raid_bdev1, state offline 00:28:08.824 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 2679157 00:28:08.824 [2024-07-12 16:04:29.243788] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:09.085 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:28:09.085 00:28:09.085 real 0m13.371s 00:28:09.085 user 0m24.751s 00:28:09.085 sys 0m2.032s 00:28:09.085 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:09.085 16:04:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:09.085 ************************************ 00:28:09.085 END TEST raid_superblock_test_md_interleaved 00:28:09.085 ************************************ 00:28:09.085 16:04:29 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:09.085 16:04:29 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:28:09.085 16:04:29 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:09.085 16:04:29 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:09.085 16:04:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:09.085 ************************************ 00:28:09.085 START TEST raid_rebuild_test_sb_md_interleaved 00:28:09.085 ************************************ 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:09.085 
16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:09.085 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=2681822 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 2681822 /var/tmp/spdk-raid.sock 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 2681822 ']' 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:28:09.086 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:09.086 16:04:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:09.086 [2024-07-12 16:04:29.505079] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:09.086 [2024-07-12 16:04:29.505147] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2681822 ] 00:28:09.086 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:09.086 Zero copy mechanism will not be used. 00:28:09.346 [2024-07-12 16:04:29.598292] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:09.346 [2024-07-12 16:04:29.690554] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:09.346 [2024-07-12 16:04:29.746269] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:09.346 [2024-07-12 16:04:29.746304] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:09.918 16:04:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:09.918 16:04:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:28:09.918 16:04:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:09.918 16:04:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:28:10.179 BaseBdev1_malloc 00:28:10.179 16:04:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:10.439 [2024-07-12 16:04:30.723606] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:10.439 [2024-07-12 16:04:30.723662] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:10.439 [2024-07-12 16:04:30.723680] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x139c980 00:28:10.439 [2024-07-12 16:04:30.723688] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:10.439 [2024-07-12 16:04:30.725057] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:10.439 [2024-07-12 16:04:30.725090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:10.439 BaseBdev1 00:28:10.439 16:04:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:10.439 16:04:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:28:10.700 BaseBdev2_malloc 00:28:10.700 16:04:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:10.700 [2024-07-12 16:04:31.138362] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:10.700 [2024-07-12 16:04:31.138407] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:10.700 [2024-07-12 16:04:31.138424] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x152a230 00:28:10.700 [2024-07-12 16:04:31.138431] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:10.700 [2024-07-12 16:04:31.139735] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:10.700 [2024-07-12 16:04:31.139769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:10.700 BaseBdev2 00:28:10.960 16:04:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:28:10.960 spare_malloc 00:28:10.960 16:04:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:11.219 spare_delay 00:28:11.219 16:04:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:11.478 [2024-07-12 16:04:31.781671] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:11.478 [2024-07-12 16:04:31.781731] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:11.478 [2024-07-12 16:04:31.781746] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x152ab00 00:28:11.478 [2024-07-12 16:04:31.781753] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:11.478 [2024-07-12 16:04:31.783000] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:11.478 [2024-07-12 16:04:31.783032] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:11.478 spare 00:28:11.478 16:04:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:11.738 [2024-07-12 16:04:31.998231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:11.738 [2024-07-12 16:04:31.999374] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:11.738 [2024-07-12 16:04:31.999529] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x151ecc0 00:28:11.738 [2024-07-12 16:04:31.999539] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:11.738 [2024-07-12 16:04:31.999602] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15289f0 00:28:11.738 [2024-07-12 16:04:31.999680] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x151ecc0 00:28:11.738 [2024-07-12 16:04:31.999685] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x151ecc0 00:28:11.738 [2024-07-12 16:04:31.999739] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.738 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.998 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.998 "name": "raid_bdev1", 00:28:11.998 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:11.998 "strip_size_kb": 0, 00:28:11.998 "state": "online", 00:28:11.998 "raid_level": "raid1", 00:28:11.998 "superblock": true, 00:28:11.998 "num_base_bdevs": 2, 00:28:11.998 "num_base_bdevs_discovered": 2, 00:28:11.998 "num_base_bdevs_operational": 2, 00:28:11.998 "base_bdevs_list": [ 00:28:11.998 { 00:28:11.998 "name": "BaseBdev1", 00:28:11.998 "uuid": "00bbcd97-0cdc-5cda-913c-d6e760910fa4", 00:28:11.998 "is_configured": true, 00:28:11.998 "data_offset": 256, 00:28:11.998 "data_size": 7936 00:28:11.998 }, 00:28:11.998 { 00:28:11.998 "name": "BaseBdev2", 00:28:11.998 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:11.998 "is_configured": true, 00:28:11.998 "data_offset": 256, 00:28:11.998 "data_size": 7936 00:28:11.998 } 00:28:11.998 ] 00:28:11.998 }' 00:28:11.998 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.998 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:12.567 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:12.567 16:04:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:12.567 [2024-07-12 16:04:32.984938] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:12.567 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:12.828 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.828 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:12.828 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:12.828 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:12.828 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:28:12.828 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:13.088 [2024-07-12 16:04:33.405757] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.088 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.348 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:13.348 "name": "raid_bdev1", 00:28:13.348 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:13.348 "strip_size_kb": 0, 00:28:13.348 "state": "online", 00:28:13.348 "raid_level": "raid1", 00:28:13.348 "superblock": true, 00:28:13.348 "num_base_bdevs": 2, 00:28:13.348 "num_base_bdevs_discovered": 1, 00:28:13.348 "num_base_bdevs_operational": 1, 00:28:13.348 "base_bdevs_list": [ 00:28:13.348 { 00:28:13.348 "name": null, 00:28:13.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:13.348 "is_configured": false, 00:28:13.348 "data_offset": 256, 00:28:13.348 "data_size": 7936 00:28:13.348 }, 00:28:13.348 { 00:28:13.348 "name": "BaseBdev2", 00:28:13.348 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:13.348 "is_configured": true, 00:28:13.348 "data_offset": 256, 00:28:13.348 "data_size": 7936 00:28:13.348 } 00:28:13.348 ] 00:28:13.348 }' 00:28:13.348 16:04:33 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:13.348 16:04:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:13.916 16:04:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:14.175 [2024-07-12 16:04:34.496535] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:14.175 [2024-07-12 16:04:34.499007] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13936a0 00:28:14.175 [2024-07-12 16:04:34.500532] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:14.175 16:04:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:15.114 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:15.114 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:15.114 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:15.114 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:15.114 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:15.114 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.114 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.373 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:15.373 "name": "raid_bdev1", 00:28:15.373 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:15.373 "strip_size_kb": 0, 00:28:15.373 "state": "online", 00:28:15.373 "raid_level": "raid1", 00:28:15.373 "superblock": true, 00:28:15.373 "num_base_bdevs": 2, 00:28:15.373 "num_base_bdevs_discovered": 2, 00:28:15.373 "num_base_bdevs_operational": 2, 00:28:15.373 "process": { 00:28:15.373 "type": "rebuild", 00:28:15.373 "target": "spare", 00:28:15.373 "progress": { 00:28:15.373 "blocks": 2816, 00:28:15.373 "percent": 35 00:28:15.373 } 00:28:15.373 }, 00:28:15.373 "base_bdevs_list": [ 00:28:15.373 { 00:28:15.373 "name": "spare", 00:28:15.373 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:15.373 "is_configured": true, 00:28:15.373 "data_offset": 256, 00:28:15.373 "data_size": 7936 00:28:15.373 }, 00:28:15.373 { 00:28:15.373 "name": "BaseBdev2", 00:28:15.373 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:15.373 "is_configured": true, 00:28:15.373 "data_offset": 256, 00:28:15.373 "data_size": 7936 00:28:15.373 } 00:28:15.373 ] 00:28:15.373 }' 00:28:15.373 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:15.373 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:15.373 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:15.373 16:04:35 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:15.373 16:04:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:15.632 [2024-07-12 16:04:35.945319] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:15.632 [2024-07-12 16:04:36.009481] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:15.632 [2024-07-12 16:04:36.009514] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:15.632 [2024-07-12 16:04:36.009524] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:15.632 [2024-07-12 16:04:36.009528] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.632 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.892 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:15.892 "name": "raid_bdev1", 00:28:15.892 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:15.892 "strip_size_kb": 0, 00:28:15.892 "state": "online", 00:28:15.892 "raid_level": "raid1", 00:28:15.892 "superblock": true, 00:28:15.892 "num_base_bdevs": 2, 00:28:15.892 "num_base_bdevs_discovered": 1, 00:28:15.892 "num_base_bdevs_operational": 1, 00:28:15.892 "base_bdevs_list": [ 00:28:15.892 { 00:28:15.892 "name": null, 00:28:15.892 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.892 "is_configured": false, 00:28:15.892 "data_offset": 256, 00:28:15.892 "data_size": 7936 00:28:15.892 }, 00:28:15.892 { 00:28:15.892 "name": "BaseBdev2", 00:28:15.892 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:15.892 "is_configured": true, 00:28:15.892 "data_offset": 256, 00:28:15.892 "data_size": 7936 00:28:15.892 } 00:28:15.892 ] 00:28:15.892 }' 00:28:15.892 
16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:15.892 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:16.459 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:16.459 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:16.460 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:16.460 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:16.460 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:16.460 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:16.460 16:04:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.028 16:04:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:17.028 "name": "raid_bdev1", 00:28:17.028 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:17.028 "strip_size_kb": 0, 00:28:17.028 "state": "online", 00:28:17.028 "raid_level": "raid1", 00:28:17.028 "superblock": true, 00:28:17.028 "num_base_bdevs": 2, 00:28:17.028 "num_base_bdevs_discovered": 1, 00:28:17.028 "num_base_bdevs_operational": 1, 00:28:17.028 "base_bdevs_list": [ 00:28:17.028 { 00:28:17.028 "name": null, 00:28:17.028 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.028 "is_configured": false, 00:28:17.028 "data_offset": 256, 00:28:17.028 "data_size": 7936 00:28:17.028 }, 00:28:17.028 { 00:28:17.028 "name": "BaseBdev2", 00:28:17.028 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:17.028 "is_configured": true, 00:28:17.028 "data_offset": 256, 00:28:17.028 "data_size": 7936 00:28:17.028 } 00:28:17.028 ] 00:28:17.028 }' 00:28:17.028 16:04:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:17.287 16:04:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:17.287 16:04:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:17.287 16:04:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:17.287 16:04:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:17.287 [2024-07-12 16:04:37.725846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:17.287 [2024-07-12 16:04:37.728339] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1393960 00:28:17.287 [2024-07-12 16:04:37.729470] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:17.546 16:04:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:18.483 16:04:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
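[editor's sketch] The records above show the pattern the test uses to watch a rebuild: it calls bdev_raid_get_bdevs all over the raid test socket, narrows the output to raid_bdev1 with jq, asserts .process.type is "rebuild" and .process.target is "spare", and loops on (( SECONDS < timeout )) with sleep 1. A minimal stand-alone sketch of that polling loop, reusing only the rpc.py path, socket, bdev name, and jq paths visible in this log (the 60-second bound is illustrative, not the test's own timeout):

  rpc=(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock)
  timeout=$((SECONDS + 60))          # illustrative bound; the test script computes its own
  while (( SECONDS < timeout )); do
      # Fetch current raid bdev info and keep only raid_bdev1.
      info=$("${rpc[@]}" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')
      # Stop as soon as no rebuild process is reported.
      [[ $(jq -r '.process.type // "none"' <<< "$info") == rebuild ]] || break
      echo "rebuild target=$(jq -r '.process.target // "none"' <<< "$info") at $(jq -r '.process.progress.percent // 0' <<< "$info")%"
      sleep 1
  done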
00:28:18.483 16:04:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:18.483 16:04:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:18.483 16:04:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:18.483 16:04:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:18.483 16:04:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.483 16:04:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:18.742 16:04:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:18.742 "name": "raid_bdev1", 00:28:18.742 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:18.742 "strip_size_kb": 0, 00:28:18.742 "state": "online", 00:28:18.742 "raid_level": "raid1", 00:28:18.742 "superblock": true, 00:28:18.742 "num_base_bdevs": 2, 00:28:18.742 "num_base_bdevs_discovered": 2, 00:28:18.742 "num_base_bdevs_operational": 2, 00:28:18.742 "process": { 00:28:18.742 "type": "rebuild", 00:28:18.742 "target": "spare", 00:28:18.742 "progress": { 00:28:18.742 "blocks": 3072, 00:28:18.742 "percent": 38 00:28:18.742 } 00:28:18.742 }, 00:28:18.742 "base_bdevs_list": [ 00:28:18.742 { 00:28:18.742 "name": "spare", 00:28:18.742 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:18.742 "is_configured": true, 00:28:18.742 "data_offset": 256, 00:28:18.742 "data_size": 7936 00:28:18.742 }, 00:28:18.742 { 00:28:18.742 "name": "BaseBdev2", 00:28:18.742 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:18.742 "is_configured": true, 00:28:18.742 "data_offset": 256, 00:28:18.742 "data_size": 7936 00:28:18.742 } 00:28:18.742 ] 00:28:18.742 }' 00:28:18.742 16:04:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:28:18.742 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1025 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:18.742 16:04:39 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.742 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:19.002 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:19.002 "name": "raid_bdev1", 00:28:19.002 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:19.002 "strip_size_kb": 0, 00:28:19.002 "state": "online", 00:28:19.002 "raid_level": "raid1", 00:28:19.002 "superblock": true, 00:28:19.002 "num_base_bdevs": 2, 00:28:19.002 "num_base_bdevs_discovered": 2, 00:28:19.002 "num_base_bdevs_operational": 2, 00:28:19.002 "process": { 00:28:19.002 "type": "rebuild", 00:28:19.002 "target": "spare", 00:28:19.002 "progress": { 00:28:19.002 "blocks": 3584, 00:28:19.002 "percent": 45 00:28:19.002 } 00:28:19.002 }, 00:28:19.002 "base_bdevs_list": [ 00:28:19.002 { 00:28:19.002 "name": "spare", 00:28:19.002 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:19.002 "is_configured": true, 00:28:19.002 "data_offset": 256, 00:28:19.002 "data_size": 7936 00:28:19.002 }, 00:28:19.002 { 00:28:19.002 "name": "BaseBdev2", 00:28:19.002 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:19.002 "is_configured": true, 00:28:19.002 "data_offset": 256, 00:28:19.002 "data_size": 7936 00:28:19.002 } 00:28:19.002 ] 00:28:19.002 }' 00:28:19.002 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:19.002 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:19.002 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:19.002 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:19.002 16:04:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:19.941 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:19.941 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:19.941 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:19.941 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:19.941 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:19.941 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:19.941 
16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:19.941 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:20.201 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:20.201 "name": "raid_bdev1", 00:28:20.201 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:20.201 "strip_size_kb": 0, 00:28:20.201 "state": "online", 00:28:20.201 "raid_level": "raid1", 00:28:20.201 "superblock": true, 00:28:20.201 "num_base_bdevs": 2, 00:28:20.201 "num_base_bdevs_discovered": 2, 00:28:20.201 "num_base_bdevs_operational": 2, 00:28:20.201 "process": { 00:28:20.201 "type": "rebuild", 00:28:20.201 "target": "spare", 00:28:20.201 "progress": { 00:28:20.201 "blocks": 6912, 00:28:20.201 "percent": 87 00:28:20.201 } 00:28:20.201 }, 00:28:20.201 "base_bdevs_list": [ 00:28:20.201 { 00:28:20.201 "name": "spare", 00:28:20.201 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:20.201 "is_configured": true, 00:28:20.201 "data_offset": 256, 00:28:20.201 "data_size": 7936 00:28:20.201 }, 00:28:20.201 { 00:28:20.201 "name": "BaseBdev2", 00:28:20.201 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:20.201 "is_configured": true, 00:28:20.201 "data_offset": 256, 00:28:20.201 "data_size": 7936 00:28:20.201 } 00:28:20.201 ] 00:28:20.201 }' 00:28:20.201 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:20.201 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:20.201 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:20.201 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:20.201 16:04:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:28:20.461 [2024-07-12 16:04:40.847610] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:20.461 [2024-07-12 16:04:40.847655] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:20.461 [2024-07-12 16:04:40.847725] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.401 16:04:41 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:21.401 "name": "raid_bdev1", 00:28:21.401 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:21.401 "strip_size_kb": 0, 00:28:21.401 "state": "online", 00:28:21.401 "raid_level": "raid1", 00:28:21.401 "superblock": true, 00:28:21.401 "num_base_bdevs": 2, 00:28:21.401 "num_base_bdevs_discovered": 2, 00:28:21.401 "num_base_bdevs_operational": 2, 00:28:21.401 "base_bdevs_list": [ 00:28:21.401 { 00:28:21.401 "name": "spare", 00:28:21.401 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:21.401 "is_configured": true, 00:28:21.401 "data_offset": 256, 00:28:21.401 "data_size": 7936 00:28:21.401 }, 00:28:21.401 { 00:28:21.401 "name": "BaseBdev2", 00:28:21.401 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:21.401 "is_configured": true, 00:28:21.401 "data_offset": 256, 00:28:21.401 "data_size": 7936 00:28:21.401 } 00:28:21.401 ] 00:28:21.401 }' 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:21.401 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:21.660 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:21.660 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:28:21.660 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:21.660 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:21.660 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:21.660 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:21.660 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:21.660 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.660 16:04:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.660 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:21.660 "name": "raid_bdev1", 00:28:21.660 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:21.660 "strip_size_kb": 0, 00:28:21.660 "state": "online", 00:28:21.661 "raid_level": "raid1", 00:28:21.661 "superblock": true, 00:28:21.661 "num_base_bdevs": 2, 00:28:21.661 "num_base_bdevs_discovered": 2, 00:28:21.661 "num_base_bdevs_operational": 2, 00:28:21.661 "base_bdevs_list": [ 00:28:21.661 { 00:28:21.661 "name": "spare", 00:28:21.661 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:21.661 "is_configured": true, 00:28:21.661 "data_offset": 256, 00:28:21.661 "data_size": 7936 00:28:21.661 }, 00:28:21.661 { 00:28:21.661 "name": "BaseBdev2", 00:28:21.661 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:21.661 
"is_configured": true, 00:28:21.661 "data_offset": 256, 00:28:21.661 "data_size": 7936 00:28:21.661 } 00:28:21.661 ] 00:28:21.661 }' 00:28:21.661 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:21.661 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:21.661 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:21.921 "name": "raid_bdev1", 00:28:21.921 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:21.921 "strip_size_kb": 0, 00:28:21.921 "state": "online", 00:28:21.921 "raid_level": "raid1", 00:28:21.921 "superblock": true, 00:28:21.921 "num_base_bdevs": 2, 00:28:21.921 "num_base_bdevs_discovered": 2, 00:28:21.921 "num_base_bdevs_operational": 2, 00:28:21.921 "base_bdevs_list": [ 00:28:21.921 { 00:28:21.921 "name": "spare", 00:28:21.921 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:21.921 "is_configured": true, 00:28:21.921 "data_offset": 256, 00:28:21.921 "data_size": 7936 00:28:21.921 }, 00:28:21.921 { 00:28:21.921 "name": "BaseBdev2", 00:28:21.921 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:21.921 "is_configured": true, 00:28:21.921 "data_offset": 256, 00:28:21.921 "data_size": 7936 00:28:21.921 } 00:28:21.921 ] 00:28:21.921 }' 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:21.921 16:04:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:22.860 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:22.860 [2024-07-12 16:04:43.281660] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:22.860 [2024-07-12 16:04:43.281681] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:22.860 [2024-07-12 16:04:43.281732] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:22.860 [2024-07-12 16:04:43.281776] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:22.860 [2024-07-12 16:04:43.281782] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x151ecc0 name raid_bdev1, state offline 00:28:22.860 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.860 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:28:23.119 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:23.119 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:28:23.120 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:23.120 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:23.379 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:23.638 [2024-07-12 16:04:43.839043] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:23.638 [2024-07-12 16:04:43.839070] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:23.638 [2024-07-12 16:04:43.839081] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1393760 00:28:23.638 [2024-07-12 16:04:43.839087] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:23.638 [2024-07-12 16:04:43.840444] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:23.638 [2024-07-12 16:04:43.840467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:23.638 [2024-07-12 16:04:43.840508] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:23.638 [2024-07-12 16:04:43.840528] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:23.638 [2024-07-12 16:04:43.840593] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:23.638 spare 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
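[editor's sketch] At this point the test has deleted raid_bdev1, confirmed via bdev_raid_get_bdevs all | jq length that no RAID bdevs remain, and recreated the spare passthru on top of spare_delay so the on-disk superblock is re-examined and the array reassembles. The same sequence condensed into a sketch (bdev names and the socket path are taken from this log; the jq length check mirrors the test's own assertion):

  rpc=(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock)
  "${rpc[@]}" bdev_raid_delete raid_bdev1                        # raid bdev goes offline and is freed
  [[ $("${rpc[@]}" bdev_raid_get_bdevs all | jq length) == 0 ]]  # no raid bdevs should remain
  "${rpc[@]}" bdev_passthru_delete spare                         # drop the old passthru vbdev
  "${rpc[@]}" bdev_passthru_create -b spare_delay -p spare       # re-register it; the raid superblock on
                                                                 # 'spare' is found and raid_bdev1 reassembles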
00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.638 16:04:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:23.638 [2024-07-12 16:04:43.940878] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13930b0 00:28:23.638 [2024-07-12 16:04:43.940887] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:23.638 [2024-07-12 16:04:43.940942] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1395ea0 00:28:23.638 [2024-07-12 16:04:43.941011] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13930b0 00:28:23.638 [2024-07-12 16:04:43.941015] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13930b0 00:28:23.638 [2024-07-12 16:04:43.941060] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:23.638 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.638 "name": "raid_bdev1", 00:28:23.638 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:23.638 "strip_size_kb": 0, 00:28:23.638 "state": "online", 00:28:23.638 "raid_level": "raid1", 00:28:23.638 "superblock": true, 00:28:23.638 "num_base_bdevs": 2, 00:28:23.638 "num_base_bdevs_discovered": 2, 00:28:23.638 "num_base_bdevs_operational": 2, 00:28:23.638 "base_bdevs_list": [ 00:28:23.638 { 00:28:23.638 "name": "spare", 00:28:23.638 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:23.638 "is_configured": true, 00:28:23.638 "data_offset": 256, 00:28:23.638 "data_size": 7936 00:28:23.638 }, 00:28:23.638 { 00:28:23.638 "name": "BaseBdev2", 00:28:23.638 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:23.638 "is_configured": true, 00:28:23.638 "data_offset": 256, 00:28:23.638 "data_size": 7936 00:28:23.638 } 00:28:23.638 ] 00:28:23.638 }' 00:28:23.638 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.638 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:24.576 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:24.576 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:24.576 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:24.576 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:28:24.576 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:24.576 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:24.576 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.576 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:24.576 "name": "raid_bdev1", 00:28:24.576 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:24.576 "strip_size_kb": 0, 00:28:24.576 "state": "online", 00:28:24.576 "raid_level": "raid1", 00:28:24.576 "superblock": true, 00:28:24.576 "num_base_bdevs": 2, 00:28:24.576 "num_base_bdevs_discovered": 2, 00:28:24.576 "num_base_bdevs_operational": 2, 00:28:24.576 "base_bdevs_list": [ 00:28:24.576 { 00:28:24.576 "name": "spare", 00:28:24.576 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:24.576 "is_configured": true, 00:28:24.576 "data_offset": 256, 00:28:24.576 "data_size": 7936 00:28:24.576 }, 00:28:24.576 { 00:28:24.576 "name": "BaseBdev2", 00:28:24.576 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:24.576 "is_configured": true, 00:28:24.576 "data_offset": 256, 00:28:24.576 "data_size": 7936 00:28:24.576 } 00:28:24.576 ] 00:28:24.576 }' 00:28:24.576 16:04:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:24.836 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:24.836 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:24.836 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:24.836 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:24.836 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:25.404 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:25.404 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:25.404 [2024-07-12 16:04:45.848253] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:25.667 
16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.667 16:04:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:25.667 16:04:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:25.667 "name": "raid_bdev1", 00:28:25.667 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:25.667 "strip_size_kb": 0, 00:28:25.667 "state": "online", 00:28:25.667 "raid_level": "raid1", 00:28:25.667 "superblock": true, 00:28:25.667 "num_base_bdevs": 2, 00:28:25.667 "num_base_bdevs_discovered": 1, 00:28:25.667 "num_base_bdevs_operational": 1, 00:28:25.667 "base_bdevs_list": [ 00:28:25.667 { 00:28:25.667 "name": null, 00:28:25.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:25.667 "is_configured": false, 00:28:25.667 "data_offset": 256, 00:28:25.667 "data_size": 7936 00:28:25.667 }, 00:28:25.667 { 00:28:25.667 "name": "BaseBdev2", 00:28:25.667 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:25.667 "is_configured": true, 00:28:25.667 "data_offset": 256, 00:28:25.667 "data_size": 7936 00:28:25.667 } 00:28:25.667 ] 00:28:25.667 }' 00:28:25.667 16:04:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:25.667 16:04:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:26.276 16:04:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:26.535 [2024-07-12 16:04:46.814706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:26.535 [2024-07-12 16:04:46.814820] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:26.535 [2024-07-12 16:04:46.814830] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
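[editor's sketch] The spare base bdev has just been removed (raid_bdev1 stays online, degraded, with one discovered base bdev) and is then handed back with bdev_raid_add_base_bdev; the NOTICE above reports this as a re-add because the spare's superblock seq_number (4) is older than the array's (5). A hedged sketch of that remove/verify/re-add cycle, using only RPCs and jq paths that appear in this log:

  rpc=(/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock)
  "${rpc[@]}" bdev_raid_remove_base_bdev spare
  # Degraded but online: expect state "online" with a single discovered base bdev.
  "${rpc[@]}" bdev_raid_get_bdevs all | jq -e '
      .[] | select(.name == "raid_bdev1")
          | .state == "online" and .num_base_bdevs_discovered == 1' > /dev/null
  "${rpc[@]}" bdev_raid_add_base_bdev raid_bdev1 spare           # triggers the re-add and rebuild seen here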
00:28:26.535 [2024-07-12 16:04:46.814847] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:26.535 [2024-07-12 16:04:46.817308] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x152ad90 00:28:26.535 [2024-07-12 16:04:46.818884] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:26.535 16:04:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:27.471 16:04:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:27.471 16:04:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:27.471 16:04:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:27.471 16:04:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:27.471 16:04:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:27.471 16:04:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.471 16:04:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:27.731 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:27.731 "name": "raid_bdev1", 00:28:27.731 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:27.731 "strip_size_kb": 0, 00:28:27.731 "state": "online", 00:28:27.731 "raid_level": "raid1", 00:28:27.731 "superblock": true, 00:28:27.731 "num_base_bdevs": 2, 00:28:27.731 "num_base_bdevs_discovered": 2, 00:28:27.731 "num_base_bdevs_operational": 2, 00:28:27.731 "process": { 00:28:27.731 "type": "rebuild", 00:28:27.731 "target": "spare", 00:28:27.731 "progress": { 00:28:27.731 "blocks": 2816, 00:28:27.731 "percent": 35 00:28:27.731 } 00:28:27.731 }, 00:28:27.731 "base_bdevs_list": [ 00:28:27.731 { 00:28:27.731 "name": "spare", 00:28:27.731 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:27.731 "is_configured": true, 00:28:27.731 "data_offset": 256, 00:28:27.731 "data_size": 7936 00:28:27.731 }, 00:28:27.731 { 00:28:27.731 "name": "BaseBdev2", 00:28:27.731 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:27.731 "is_configured": true, 00:28:27.731 "data_offset": 256, 00:28:27.731 "data_size": 7936 00:28:27.731 } 00:28:27.731 ] 00:28:27.731 }' 00:28:27.731 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:27.731 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:27.731 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:27.731 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:27.731 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:27.991 [2024-07-12 16:04:48.303776] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:27.991 [2024-07-12 16:04:48.327789] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:27.991 [2024-07-12 16:04:48.327818] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:27.991 [2024-07-12 16:04:48.327828] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:27.991 [2024-07-12 16:04:48.327832] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:27.991 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:28.251 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:28.251 "name": "raid_bdev1", 00:28:28.251 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:28.251 "strip_size_kb": 0, 00:28:28.251 "state": "online", 00:28:28.251 "raid_level": "raid1", 00:28:28.251 "superblock": true, 00:28:28.251 "num_base_bdevs": 2, 00:28:28.251 "num_base_bdevs_discovered": 1, 00:28:28.251 "num_base_bdevs_operational": 1, 00:28:28.251 "base_bdevs_list": [ 00:28:28.251 { 00:28:28.251 "name": null, 00:28:28.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.251 "is_configured": false, 00:28:28.251 "data_offset": 256, 00:28:28.251 "data_size": 7936 00:28:28.251 }, 00:28:28.251 { 00:28:28.251 "name": "BaseBdev2", 00:28:28.251 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:28.251 "is_configured": true, 00:28:28.251 "data_offset": 256, 00:28:28.251 "data_size": 7936 00:28:28.251 } 00:28:28.251 ] 00:28:28.251 }' 00:28:28.251 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:28.251 16:04:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:28.819 16:04:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:29.079 [2024-07-12 
16:04:49.298271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:29.079 [2024-07-12 16:04:49.298307] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:29.079 [2024-07-12 16:04:49.298320] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1395bd0 00:28:29.079 [2024-07-12 16:04:49.298331] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:29.079 [2024-07-12 16:04:49.298485] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:29.079 [2024-07-12 16:04:49.298495] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:29.079 [2024-07-12 16:04:49.298531] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:29.079 [2024-07-12 16:04:49.298537] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:29.079 [2024-07-12 16:04:49.298543] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:29.079 [2024-07-12 16:04:49.298553] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:29.079 [2024-07-12 16:04:49.300883] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1396690 00:28:29.079 [2024-07-12 16:04:49.302008] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:29.079 spare 00:28:29.079 16:04:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:30.018 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:30.018 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:30.018 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:30.018 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:30.018 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:30.018 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.018 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:30.277 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:30.277 "name": "raid_bdev1", 00:28:30.277 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:30.277 "strip_size_kb": 0, 00:28:30.277 "state": "online", 00:28:30.277 "raid_level": "raid1", 00:28:30.277 "superblock": true, 00:28:30.277 "num_base_bdevs": 2, 00:28:30.277 "num_base_bdevs_discovered": 2, 00:28:30.277 "num_base_bdevs_operational": 2, 00:28:30.277 "process": { 00:28:30.277 "type": "rebuild", 00:28:30.277 "target": "spare", 00:28:30.277 "progress": { 00:28:30.278 "blocks": 2816, 00:28:30.278 "percent": 35 00:28:30.278 } 00:28:30.278 }, 00:28:30.278 "base_bdevs_list": [ 00:28:30.278 { 00:28:30.278 "name": "spare", 00:28:30.278 "uuid": "e1003fd1-51fe-5eaf-9f22-7f30077d3b5c", 00:28:30.278 "is_configured": true, 00:28:30.278 "data_offset": 256, 00:28:30.278 
"data_size": 7936 00:28:30.278 }, 00:28:30.278 { 00:28:30.278 "name": "BaseBdev2", 00:28:30.278 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:30.278 "is_configured": true, 00:28:30.278 "data_offset": 256, 00:28:30.278 "data_size": 7936 00:28:30.278 } 00:28:30.278 ] 00:28:30.278 }' 00:28:30.278 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:30.278 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:30.278 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:30.278 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:30.278 16:04:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:30.847 [2024-07-12 16:04:51.171878] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:30.847 [2024-07-12 16:04:51.213118] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:30.847 [2024-07-12 16:04:51.213149] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:30.847 [2024-07-12 16:04:51.213158] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:30.847 [2024-07-12 16:04:51.213163] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.847 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.108 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:31.108 "name": "raid_bdev1", 00:28:31.108 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:31.108 "strip_size_kb": 0, 00:28:31.108 "state": "online", 00:28:31.108 
"raid_level": "raid1", 00:28:31.108 "superblock": true, 00:28:31.108 "num_base_bdevs": 2, 00:28:31.108 "num_base_bdevs_discovered": 1, 00:28:31.108 "num_base_bdevs_operational": 1, 00:28:31.108 "base_bdevs_list": [ 00:28:31.108 { 00:28:31.108 "name": null, 00:28:31.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.108 "is_configured": false, 00:28:31.108 "data_offset": 256, 00:28:31.108 "data_size": 7936 00:28:31.108 }, 00:28:31.108 { 00:28:31.108 "name": "BaseBdev2", 00:28:31.108 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:31.108 "is_configured": true, 00:28:31.108 "data_offset": 256, 00:28:31.108 "data_size": 7936 00:28:31.108 } 00:28:31.108 ] 00:28:31.108 }' 00:28:31.108 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:31.108 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:31.678 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:31.678 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:31.678 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:31.678 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:31.678 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:31.678 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:31.678 16:04:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:31.937 16:04:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:31.937 "name": "raid_bdev1", 00:28:31.937 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:31.937 "strip_size_kb": 0, 00:28:31.937 "state": "online", 00:28:31.937 "raid_level": "raid1", 00:28:31.937 "superblock": true, 00:28:31.937 "num_base_bdevs": 2, 00:28:31.937 "num_base_bdevs_discovered": 1, 00:28:31.937 "num_base_bdevs_operational": 1, 00:28:31.937 "base_bdevs_list": [ 00:28:31.937 { 00:28:31.937 "name": null, 00:28:31.937 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:31.937 "is_configured": false, 00:28:31.937 "data_offset": 256, 00:28:31.937 "data_size": 7936 00:28:31.937 }, 00:28:31.937 { 00:28:31.937 "name": "BaseBdev2", 00:28:31.937 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:31.937 "is_configured": true, 00:28:31.937 "data_offset": 256, 00:28:31.937 "data_size": 7936 00:28:31.937 } 00:28:31.937 ] 00:28:31.937 }' 00:28:31.937 16:04:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:31.937 16:04:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:31.938 16:04:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:31.938 16:04:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:31.938 16:04:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:32.197 16:04:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:32.197 [2024-07-12 16:04:52.643986] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:32.197 [2024-07-12 16:04:52.644018] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:32.197 [2024-07-12 16:04:52.644034] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x139b760 00:28:32.197 [2024-07-12 16:04:52.644040] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:32.197 [2024-07-12 16:04:52.644174] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:32.197 [2024-07-12 16:04:52.644183] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:32.197 [2024-07-12 16:04:52.644213] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:32.197 [2024-07-12 16:04:52.644220] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:32.197 [2024-07-12 16:04:52.644225] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:32.457 BaseBdev1 00:28:32.457 16:04:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:33.397 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:33.398 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:33.398 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:33.657 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:33.657 "name": "raid_bdev1", 00:28:33.657 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:33.657 "strip_size_kb": 0, 00:28:33.657 "state": "online", 00:28:33.657 "raid_level": "raid1", 00:28:33.657 
"superblock": true, 00:28:33.657 "num_base_bdevs": 2, 00:28:33.657 "num_base_bdevs_discovered": 1, 00:28:33.657 "num_base_bdevs_operational": 1, 00:28:33.657 "base_bdevs_list": [ 00:28:33.657 { 00:28:33.657 "name": null, 00:28:33.657 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:33.657 "is_configured": false, 00:28:33.657 "data_offset": 256, 00:28:33.657 "data_size": 7936 00:28:33.657 }, 00:28:33.657 { 00:28:33.657 "name": "BaseBdev2", 00:28:33.657 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:33.657 "is_configured": true, 00:28:33.657 "data_offset": 256, 00:28:33.657 "data_size": 7936 00:28:33.657 } 00:28:33.657 ] 00:28:33.657 }' 00:28:33.657 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:33.657 16:04:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:34.226 "name": "raid_bdev1", 00:28:34.226 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:34.226 "strip_size_kb": 0, 00:28:34.226 "state": "online", 00:28:34.226 "raid_level": "raid1", 00:28:34.226 "superblock": true, 00:28:34.226 "num_base_bdevs": 2, 00:28:34.226 "num_base_bdevs_discovered": 1, 00:28:34.226 "num_base_bdevs_operational": 1, 00:28:34.226 "base_bdevs_list": [ 00:28:34.226 { 00:28:34.226 "name": null, 00:28:34.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:34.226 "is_configured": false, 00:28:34.226 "data_offset": 256, 00:28:34.226 "data_size": 7936 00:28:34.226 }, 00:28:34.226 { 00:28:34.226 "name": "BaseBdev2", 00:28:34.226 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:34.226 "is_configured": true, 00:28:34.226 "data_offset": 256, 00:28:34.226 "data_size": 7936 00:28:34.226 } 00:28:34.226 ] 00:28:34.226 }' 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:34.226 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:34.486 [2024-07-12 16:04:54.869622] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:34.486 [2024-07-12 16:04:54.869717] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:34.486 [2024-07-12 16:04:54.869725] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:34.486 request: 00:28:34.486 { 00:28:34.486 "base_bdev": "BaseBdev1", 00:28:34.486 "raid_bdev": "raid_bdev1", 00:28:34.486 "method": "bdev_raid_add_base_bdev", 00:28:34.486 "req_id": 1 00:28:34.486 } 00:28:34.486 Got JSON-RPC error response 00:28:34.486 response: 00:28:34.486 { 00:28:34.486 "code": -22, 00:28:34.486 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:34.486 } 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:34.486 16:04:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:35.870 16:04:55 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:35.870 16:04:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:35.870 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:35.870 "name": "raid_bdev1", 00:28:35.870 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:35.870 "strip_size_kb": 0, 00:28:35.870 "state": "online", 00:28:35.870 "raid_level": "raid1", 00:28:35.870 "superblock": true, 00:28:35.870 "num_base_bdevs": 2, 00:28:35.870 "num_base_bdevs_discovered": 1, 00:28:35.870 "num_base_bdevs_operational": 1, 00:28:35.870 "base_bdevs_list": [ 00:28:35.870 { 00:28:35.870 "name": null, 00:28:35.870 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:35.870 "is_configured": false, 00:28:35.870 "data_offset": 256, 00:28:35.870 "data_size": 7936 00:28:35.870 }, 00:28:35.870 { 00:28:35.870 "name": "BaseBdev2", 00:28:35.870 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:35.870 "is_configured": true, 00:28:35.870 "data_offset": 256, 00:28:35.870 "data_size": 7936 00:28:35.870 } 00:28:35.870 ] 00:28:35.870 }' 00:28:35.870 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:35.870 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:36.812 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:36.812 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:36.812 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:36.812 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:36.812 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:36.812 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:36.812 16:04:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:36.812 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:36.812 "name": "raid_bdev1", 00:28:36.812 "uuid": "12c782c9-9cc1-4d66-97ed-c5dbbe707393", 00:28:36.812 "strip_size_kb": 0, 00:28:36.812 "state": "online", 00:28:36.812 "raid_level": "raid1", 00:28:36.812 "superblock": true, 00:28:36.812 "num_base_bdevs": 2, 00:28:36.812 "num_base_bdevs_discovered": 1, 00:28:36.812 "num_base_bdevs_operational": 1, 00:28:36.812 "base_bdevs_list": [ 00:28:36.812 { 00:28:36.812 "name": null, 00:28:36.812 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:36.812 "is_configured": false, 00:28:36.812 "data_offset": 256, 00:28:36.812 "data_size": 7936 00:28:36.812 }, 00:28:36.812 { 00:28:36.812 "name": "BaseBdev2", 00:28:36.812 "uuid": "bb304092-c65c-515a-9851-9f3f6812f771", 00:28:36.812 "is_configured": true, 00:28:36.812 "data_offset": 256, 00:28:36.812 "data_size": 7936 00:28:36.812 } 00:28:36.812 ] 00:28:36.812 }' 00:28:36.812 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:36.812 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:36.812 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 2681822 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 2681822 ']' 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 2681822 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2681822 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2681822' 00:28:37.073 killing process with pid 2681822 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 2681822 00:28:37.073 Received shutdown signal, test time was about 60.000000 seconds 00:28:37.073 00:28:37.073 Latency(us) 00:28:37.073 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:37.073 =================================================================================================================== 00:28:37.073 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:37.073 [2024-07-12 16:04:57.315177] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:37.073 [2024-07-12 16:04:57.315246] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:37.073 [2024-07-12 16:04:57.315278] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:37.073 [2024-07-12 16:04:57.315284] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13930b0 name raid_bdev1, state offline 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 2681822 00:28:37.073 [2024-07-12 16:04:57.330958] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:28:37.073 00:28:37.073 real 0m28.014s 00:28:37.073 user 0m45.375s 00:28:37.073 sys 0m2.945s 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:37.073 16:04:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:37.073 ************************************ 00:28:37.073 END TEST raid_rebuild_test_sb_md_interleaved 00:28:37.073 ************************************ 00:28:37.073 16:04:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:37.073 16:04:57 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:28:37.073 16:04:57 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:28:37.073 16:04:57 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 2681822 ']' 00:28:37.073 16:04:57 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 2681822 00:28:37.334 16:04:57 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:28:37.334 00:28:37.334 real 16m52.856s 00:28:37.334 user 29m9.900s 00:28:37.334 sys 2m25.720s 00:28:37.334 16:04:57 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:37.334 16:04:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:37.334 ************************************ 00:28:37.334 END TEST bdev_raid 00:28:37.334 ************************************ 00:28:37.334 16:04:57 -- common/autotest_common.sh@1142 -- # return 0 00:28:37.334 16:04:57 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:28:37.334 16:04:57 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:28:37.334 16:04:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:37.334 16:04:57 -- common/autotest_common.sh@10 -- # set +x 00:28:37.334 ************************************ 00:28:37.334 START TEST bdevperf_config 00:28:37.334 ************************************ 00:28:37.334 16:04:57 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:28:37.334 * Looking for test storage... 
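The raid_rebuild_test_sb_md_interleaved section above keeps re-checking the array through the RPC socket: bdev_raid_get_bdevs all is filtered with jq down to raid_bdev1, and .process.type / .process.target are expected to fall back to "none" once no rebuild is in flight. A minimal stand-alone sketch of that check, reusing the rpc.py path and socket shown in the trace (an illustration only, not part of the harness):

#!/usr/bin/env bash
# Sketch of the verify_raid_bdev_process pattern traced above.
rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
sock=/var/tmp/spdk-raid.sock

# Pull the raid_bdev1 descriptor out of the full bdev_raid_get_bdevs listing.
info=$("$rpc" -s "$sock" bdev_raid_get_bdevs all | jq -r '.[] | select(.name == "raid_bdev1")')

# With no rebuild running, both process fields resolve to "none".
[[ $(jq -r '.process.type // "none"' <<< "$info") == none ]]
[[ $(jq -r '.process.target // "none"' <<< "$info") == none ]]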
00:28:37.334 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:37.334 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:37.334 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:37.334 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:28:37.334 16:04:57 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@19 -- # 
echo 00:28:37.335 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:37.335 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:37.335 16:04:57 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:39.876 16:05:00 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-12 16:04:57.795805] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:39.876 [2024-07-12 16:04:57.795873] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2686931 ] 00:28:39.876 Using job config with 4 jobs 00:28:39.876 [2024-07-12 16:04:57.898337] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:39.876 [2024-07-12 16:04:57.972637] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:39.876 cpumask for '\''job0'\'' is too big 00:28:39.876 cpumask for '\''job1'\'' is too big 00:28:39.876 cpumask for '\''job2'\'' is too big 00:28:39.876 cpumask for '\''job3'\'' is too big 00:28:39.876 Running I/O for 2 seconds... 00:28:39.876 00:28:39.876 Latency(us) 00:28:39.876 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:39.876 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.876 Malloc0 : 2.02 28436.08 27.77 0.00 0.00 8992.33 1606.89 14014.62 00:28:39.876 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.876 Malloc0 : 2.02 28413.87 27.75 0.00 0.00 8983.44 1587.99 12451.84 00:28:39.876 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.876 Malloc0 : 2.02 28391.84 27.73 0.00 0.00 8972.50 1600.59 10838.65 00:28:39.876 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.876 Malloc0 : 2.02 28369.78 27.70 0.00 0.00 8961.60 1587.99 9427.10 00:28:39.876 =================================================================================================================== 00:28:39.876 Total : 113611.57 110.95 0.00 0.00 8977.47 1587.99 14014.62' 00:28:39.876 16:05:00 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-12 16:04:57.795805] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:28:39.876 [2024-07-12 16:04:57.795873] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2686931 ] 00:28:39.876 Using job config with 4 jobs 00:28:39.876 [2024-07-12 16:04:57.898337] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:39.876 [2024-07-12 16:04:57.972637] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:39.876 cpumask for '\''job0'\'' is too big 00:28:39.876 cpumask for '\''job1'\'' is too big 00:28:39.876 cpumask for '\''job2'\'' is too big 00:28:39.876 cpumask for '\''job3'\'' is too big 00:28:39.876 Running I/O for 2 seconds... 00:28:39.876 00:28:39.876 Latency(us) 00:28:39.876 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:39.876 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.876 Malloc0 : 2.02 28436.08 27.77 0.00 0.00 8992.33 1606.89 14014.62 00:28:39.876 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.876 Malloc0 : 2.02 28413.87 27.75 0.00 0.00 8983.44 1587.99 12451.84 00:28:39.876 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.876 Malloc0 : 2.02 28391.84 27.73 0.00 0.00 8972.50 1600.59 10838.65 00:28:39.876 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.876 Malloc0 : 2.02 28369.78 27.70 0.00 0.00 8961.60 1587.99 9427.10 00:28:39.876 =================================================================================================================== 00:28:39.876 Total : 113611.57 110.95 0.00 0.00 8977.47 1587.99 14014.62' 00:28:39.876 16:05:00 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 16:04:57.795805] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:39.877 [2024-07-12 16:04:57.795873] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2686931 ] 00:28:39.877 Using job config with 4 jobs 00:28:39.877 [2024-07-12 16:04:57.898337] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:39.877 [2024-07-12 16:04:57.972637] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:39.877 cpumask for '\''job0'\'' is too big 00:28:39.877 cpumask for '\''job1'\'' is too big 00:28:39.877 cpumask for '\''job2'\'' is too big 00:28:39.877 cpumask for '\''job3'\'' is too big 00:28:39.877 Running I/O for 2 seconds... 
00:28:39.877 00:28:39.877 Latency(us) 00:28:39.877 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:39.877 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.877 Malloc0 : 2.02 28436.08 27.77 0.00 0.00 8992.33 1606.89 14014.62 00:28:39.877 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.877 Malloc0 : 2.02 28413.87 27.75 0.00 0.00 8983.44 1587.99 12451.84 00:28:39.877 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.877 Malloc0 : 2.02 28391.84 27.73 0.00 0.00 8972.50 1600.59 10838.65 00:28:39.877 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:39.877 Malloc0 : 2.02 28369.78 27.70 0.00 0.00 8961.60 1587.99 9427.10 00:28:39.877 =================================================================================================================== 00:28:39.877 Total : 113611.57 110.95 0.00 0.00 8977.47 1587.99 14014.62' 00:28:39.877 16:05:00 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:39.877 16:05:00 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:39.877 16:05:00 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:28:39.877 16:05:00 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:39.877 [2024-07-12 16:05:00.307367] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:39.877 [2024-07-12 16:05:00.307418] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687250 ] 00:28:40.137 [2024-07-12 16:05:00.410034] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:40.137 [2024-07-12 16:05:00.484504] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:40.137 cpumask for 'job0' is too big 00:28:40.137 cpumask for 'job1' is too big 00:28:40.137 cpumask for 'job2' is too big 00:28:40.137 cpumask for 'job3' is too big 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:28:42.680 Running I/O for 2 seconds... 
00:28:42.680 00:28:42.680 Latency(us) 00:28:42.680 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:42.680 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:42.680 Malloc0 : 2.01 28660.51 27.99 0.00 0.00 8923.29 1625.80 13812.97 00:28:42.680 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:42.680 Malloc0 : 2.02 28669.00 28.00 0.00 0.00 8902.99 1587.99 12250.19 00:28:42.680 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:42.680 Malloc0 : 2.02 28646.80 27.98 0.00 0.00 8892.23 1606.89 10687.41 00:28:42.680 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:42.680 Malloc0 : 2.02 28624.59 27.95 0.00 0.00 8883.80 1569.08 9175.04 00:28:42.680 =================================================================================================================== 00:28:42.680 Total : 114600.91 111.91 0.00 0.00 8900.55 1569.08 13812.97' 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:42.680 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job job1 write Malloc0 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:42.680 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:42.680 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:42.680 16:05:02 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:45.221 16:05:05 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-12 16:05:02.820619] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:45.221 [2024-07-12 16:05:02.820673] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687653 ] 00:28:45.221 Using job config with 3 jobs 00:28:45.221 [2024-07-12 16:05:02.931957] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:45.221 [2024-07-12 16:05:03.010275] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:45.221 cpumask for '\''job0'\'' is too big 00:28:45.221 cpumask for '\''job1'\'' is too big 00:28:45.221 cpumask for '\''job2'\'' is too big 00:28:45.221 Running I/O for 2 seconds... 00:28:45.221 00:28:45.221 Latency(us) 00:28:45.221 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:45.221 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:45.221 Malloc0 : 2.01 38486.20 37.58 0.00 0.00 6638.94 1613.19 9830.40 00:28:45.221 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:45.221 Malloc0 : 2.01 38497.37 37.60 0.00 0.00 6624.16 1531.27 8318.03 00:28:45.221 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:45.221 Malloc0 : 2.02 38467.53 37.57 0.00 0.00 6616.44 1543.88 6856.07 00:28:45.221 =================================================================================================================== 00:28:45.221 Total : 115451.09 112.75 0.00 0.00 6626.50 1531.27 9830.40' 00:28:45.221 16:05:05 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-12 16:05:02.820619] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:45.221 [2024-07-12 16:05:02.820673] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687653 ] 00:28:45.221 Using job config with 3 jobs 00:28:45.221 [2024-07-12 16:05:02.931957] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:45.221 [2024-07-12 16:05:03.010275] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:45.221 cpumask for '\''job0'\'' is too big 00:28:45.221 cpumask for '\''job1'\'' is too big 00:28:45.221 cpumask for '\''job2'\'' is too big 00:28:45.221 Running I/O for 2 seconds... 
00:28:45.221 00:28:45.221 Latency(us) 00:28:45.221 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:45.221 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:45.221 Malloc0 : 2.01 38486.20 37.58 0.00 0.00 6638.94 1613.19 9830.40 00:28:45.221 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:45.221 Malloc0 : 2.01 38497.37 37.60 0.00 0.00 6624.16 1531.27 8318.03 00:28:45.221 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:45.221 Malloc0 : 2.02 38467.53 37.57 0.00 0.00 6616.44 1543.88 6856.07 00:28:45.221 =================================================================================================================== 00:28:45.221 Total : 115451.09 112.75 0.00 0.00 6626.50 1531.27 9830.40' 00:28:45.221 16:05:05 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 16:05:02.820619] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:45.221 [2024-07-12 16:05:02.820673] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687653 ] 00:28:45.221 Using job config with 3 jobs 00:28:45.221 [2024-07-12 16:05:02.931957] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:45.221 [2024-07-12 16:05:03.010275] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:45.221 cpumask for '\''job0'\'' is too big 00:28:45.221 cpumask for '\''job1'\'' is too big 00:28:45.222 cpumask for '\''job2'\'' is too big 00:28:45.222 Running I/O for 2 seconds... 00:28:45.222 00:28:45.222 Latency(us) 00:28:45.222 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:45.222 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:45.222 Malloc0 : 2.01 38486.20 37.58 0.00 0.00 6638.94 1613.19 9830.40 00:28:45.222 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:45.222 Malloc0 : 2.01 38497.37 37.60 0.00 0.00 6624.16 1531.27 8318.03 00:28:45.222 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:45.222 Malloc0 : 2.02 38467.53 37.57 0.00 0.00 6616.44 1543.88 6856.07 00:28:45.222 =================================================================================================================== 00:28:45.222 Total : 115451.09 112.75 0.00 0.00 6626.50 1531.27 9830.40' 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 
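# Sketch only (not part of bdevperf/common.sh): the create_job calls traced in this
# test append one INI-style stanza per section to test.conf before bdevperf is started
# with -j test.conf. Section names ([global], [job0], ...) and the rw/filename values
# come straight from the locals shown above; the exact key names written into the
# stanza are an assumption made for illustration.
testconf=/tmp/bdevperf_sketch.conf    # stand-in path for this sketch
cat >> "$testconf" <<EOF
[global]
rw=rw
filename=Malloc0:Malloc1
EOF
# get_num_jobs later just counts the jobs bdevperf reports in the captured output:
#   grep -oE 'Using job config with [0-9]+ jobs' | grep -oE '[0-9]+'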
00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:45.222 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:45.222 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:45.222 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:45.222 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:45.222 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:45.222 16:05:05 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:47.835 16:05:07 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-12 16:05:05.377022] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:28:47.835 [2024-07-12 16:05:05.377078] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688173 ] 00:28:47.835 Using job config with 4 jobs 00:28:47.835 [2024-07-12 16:05:05.476074] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:47.835 [2024-07-12 16:05:05.563097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:47.835 cpumask for '\''job0'\'' is too big 00:28:47.835 cpumask for '\''job1'\'' is too big 00:28:47.835 cpumask for '\''job2'\'' is too big 00:28:47.835 cpumask for '\''job3'\'' is too big 00:28:47.835 Running I/O for 2 seconds... 00:28:47.835 00:28:47.835 Latency(us) 00:28:47.835 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.03 14133.19 13.80 0.00 0.00 18096.26 3251.59 28029.24 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.03 14121.92 13.79 0.00 0.00 18095.53 3906.95 28029.24 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.03 14110.96 13.78 0.00 0.00 18053.24 3213.78 24802.86 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.03 14099.80 13.77 0.00 0.00 18054.28 3881.75 24802.86 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.04 14088.87 13.76 0.00 0.00 18014.01 3251.59 21475.64 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.04 14077.69 13.75 0.00 0.00 18016.13 3906.95 21475.64 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.04 14160.53 13.83 0.00 0.00 17856.39 3075.15 18450.90 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.04 14149.36 13.82 0.00 0.00 17856.15 2419.79 18450.90 00:28:47.835 =================================================================================================================== 00:28:47.835 Total : 112942.32 110.30 0.00 0.00 18004.92 2419.79 28029.24' 00:28:47.835 16:05:07 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-12 16:05:05.377022] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:47.835 [2024-07-12 16:05:05.377078] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688173 ] 00:28:47.835 Using job config with 4 jobs 00:28:47.835 [2024-07-12 16:05:05.476074] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:47.835 [2024-07-12 16:05:05.563097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:47.835 cpumask for '\''job0'\'' is too big 00:28:47.835 cpumask for '\''job1'\'' is too big 00:28:47.835 cpumask for '\''job2'\'' is too big 00:28:47.835 cpumask for '\''job3'\'' is too big 00:28:47.835 Running I/O for 2 seconds... 
00:28:47.835 00:28:47.835 Latency(us) 00:28:47.835 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.03 14133.19 13.80 0.00 0.00 18096.26 3251.59 28029.24 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.03 14121.92 13.79 0.00 0.00 18095.53 3906.95 28029.24 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.03 14110.96 13.78 0.00 0.00 18053.24 3213.78 24802.86 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.03 14099.80 13.77 0.00 0.00 18054.28 3881.75 24802.86 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.04 14088.87 13.76 0.00 0.00 18014.01 3251.59 21475.64 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.04 14077.69 13.75 0.00 0.00 18016.13 3906.95 21475.64 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.04 14160.53 13.83 0.00 0.00 17856.39 3075.15 18450.90 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.04 14149.36 13.82 0.00 0.00 17856.15 2419.79 18450.90 00:28:47.835 =================================================================================================================== 00:28:47.835 Total : 112942.32 110.30 0.00 0.00 18004.92 2419.79 28029.24' 00:28:47.835 16:05:07 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-12 16:05:05.377022] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:47.835 [2024-07-12 16:05:05.377078] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688173 ] 00:28:47.835 Using job config with 4 jobs 00:28:47.835 [2024-07-12 16:05:05.476074] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:47.835 [2024-07-12 16:05:05.563097] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:47.835 cpumask for '\''job0'\'' is too big 00:28:47.835 cpumask for '\''job1'\'' is too big 00:28:47.835 cpumask for '\''job2'\'' is too big 00:28:47.835 cpumask for '\''job3'\'' is too big 00:28:47.835 Running I/O for 2 seconds... 
00:28:47.835 00:28:47.835 Latency(us) 00:28:47.835 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.03 14133.19 13.80 0.00 0.00 18096.26 3251.59 28029.24 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.03 14121.92 13.79 0.00 0.00 18095.53 3906.95 28029.24 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.03 14110.96 13.78 0.00 0.00 18053.24 3213.78 24802.86 00:28:47.835 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc1 : 2.03 14099.80 13.77 0.00 0.00 18054.28 3881.75 24802.86 00:28:47.835 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.835 Malloc0 : 2.04 14088.87 13.76 0.00 0.00 18014.01 3251.59 21475.64 00:28:47.836 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.836 Malloc1 : 2.04 14077.69 13.75 0.00 0.00 18016.13 3906.95 21475.64 00:28:47.836 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.836 Malloc0 : 2.04 14160.53 13.83 0.00 0.00 17856.39 3075.15 18450.90 00:28:47.836 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:47.836 Malloc1 : 2.04 14149.36 13.82 0.00 0.00 17856.15 2419.79 18450.90 00:28:47.836 =================================================================================================================== 00:28:47.836 Total : 112942.32 110.30 0.00 0.00 18004.92 2419.79 28029.24' 00:28:47.836 16:05:07 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:47.836 16:05:07 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:47.836 16:05:07 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:28:47.836 16:05:07 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:28:47.836 16:05:07 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:47.836 16:05:07 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:28:47.836 00:28:47.836 real 0m10.249s 00:28:47.836 user 0m9.256s 00:28:47.836 sys 0m0.841s 00:28:47.836 16:05:07 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:47.836 16:05:07 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:28:47.836 ************************************ 00:28:47.836 END TEST bdevperf_config 00:28:47.836 ************************************ 00:28:47.836 16:05:07 -- common/autotest_common.sh@1142 -- # return 0 00:28:47.836 16:05:07 -- spdk/autotest.sh@192 -- # uname -s 00:28:47.836 16:05:07 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:28:47.836 16:05:07 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:47.836 16:05:07 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:28:47.836 16:05:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:47.836 16:05:07 -- common/autotest_common.sh@10 -- # set +x 00:28:47.836 ************************************ 00:28:47.836 START TEST reactor_set_interrupt 00:28:47.836 ************************************ 00:28:47.836 16:05:07 
reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:47.836 * Looking for test storage... 00:28:47.836 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:47.836 16:05:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:28:47.836 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:47.836 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:47.836 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:47.836 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:28:47.836 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:47.836 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:28:47.836 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:28:47.836 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:28:47.836 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:28:47.836 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:28:47.836 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:28:47.836 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:28:47.836 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:28:47.836 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 
00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:28:47.836 16:05:08 
reactor_set_interrupt -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:28:47.836 16:05:08 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:28:47.837 16:05:08 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:28:47.837 16:05:08 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:28:47.837 16:05:08 
reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:28:47.837 #define SPDK_CONFIG_H 00:28:47.837 #define SPDK_CONFIG_APPS 1 00:28:47.837 #define SPDK_CONFIG_ARCH native 00:28:47.837 #undef SPDK_CONFIG_ASAN 00:28:47.837 #undef SPDK_CONFIG_AVAHI 00:28:47.837 #undef SPDK_CONFIG_CET 00:28:47.837 #define SPDK_CONFIG_COVERAGE 1 00:28:47.837 #define SPDK_CONFIG_CROSS_PREFIX 00:28:47.837 #define SPDK_CONFIG_CRYPTO 1 00:28:47.837 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:28:47.837 #undef SPDK_CONFIG_CUSTOMOCF 00:28:47.837 #undef SPDK_CONFIG_DAOS 00:28:47.837 #define SPDK_CONFIG_DAOS_DIR 00:28:47.837 #define SPDK_CONFIG_DEBUG 1 00:28:47.837 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:28:47.837 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:47.837 #define SPDK_CONFIG_DPDK_INC_DIR 00:28:47.837 #define SPDK_CONFIG_DPDK_LIB_DIR 00:28:47.837 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:28:47.837 #undef SPDK_CONFIG_DPDK_UADK 00:28:47.837 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:47.837 #define SPDK_CONFIG_EXAMPLES 1 00:28:47.837 #undef SPDK_CONFIG_FC 00:28:47.837 #define SPDK_CONFIG_FC_PATH 00:28:47.837 #define SPDK_CONFIG_FIO_PLUGIN 1 00:28:47.837 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:28:47.837 #undef SPDK_CONFIG_FUSE 00:28:47.837 #undef SPDK_CONFIG_FUZZER 00:28:47.837 #define SPDK_CONFIG_FUZZER_LIB 00:28:47.837 #undef SPDK_CONFIG_GOLANG 00:28:47.837 #define 
SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:28:47.837 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:28:47.837 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:28:47.837 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:28:47.837 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:28:47.837 #undef SPDK_CONFIG_HAVE_LIBBSD 00:28:47.837 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:28:47.837 #define SPDK_CONFIG_IDXD 1 00:28:47.837 #define SPDK_CONFIG_IDXD_KERNEL 1 00:28:47.837 #define SPDK_CONFIG_IPSEC_MB 1 00:28:47.837 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:47.837 #define SPDK_CONFIG_ISAL 1 00:28:47.837 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:28:47.837 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:28:47.837 #define SPDK_CONFIG_LIBDIR 00:28:47.837 #undef SPDK_CONFIG_LTO 00:28:47.837 #define SPDK_CONFIG_MAX_LCORES 128 00:28:47.837 #define SPDK_CONFIG_NVME_CUSE 1 00:28:47.837 #undef SPDK_CONFIG_OCF 00:28:47.837 #define SPDK_CONFIG_OCF_PATH 00:28:47.837 #define SPDK_CONFIG_OPENSSL_PATH 00:28:47.837 #undef SPDK_CONFIG_PGO_CAPTURE 00:28:47.837 #define SPDK_CONFIG_PGO_DIR 00:28:47.837 #undef SPDK_CONFIG_PGO_USE 00:28:47.837 #define SPDK_CONFIG_PREFIX /usr/local 00:28:47.837 #undef SPDK_CONFIG_RAID5F 00:28:47.837 #undef SPDK_CONFIG_RBD 00:28:47.837 #define SPDK_CONFIG_RDMA 1 00:28:47.837 #define SPDK_CONFIG_RDMA_PROV verbs 00:28:47.837 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:28:47.837 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:28:47.837 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:28:47.837 #define SPDK_CONFIG_SHARED 1 00:28:47.837 #undef SPDK_CONFIG_SMA 00:28:47.837 #define SPDK_CONFIG_TESTS 1 00:28:47.837 #undef SPDK_CONFIG_TSAN 00:28:47.837 #define SPDK_CONFIG_UBLK 1 00:28:47.837 #define SPDK_CONFIG_UBSAN 1 00:28:47.837 #undef SPDK_CONFIG_UNIT_TESTS 00:28:47.837 #undef SPDK_CONFIG_URING 00:28:47.837 #define SPDK_CONFIG_URING_PATH 00:28:47.837 #undef SPDK_CONFIG_URING_ZNS 00:28:47.837 #undef SPDK_CONFIG_USDT 00:28:47.837 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:28:47.837 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:28:47.837 #undef SPDK_CONFIG_VFIO_USER 00:28:47.837 #define SPDK_CONFIG_VFIO_USER_DIR 00:28:47.837 #define SPDK_CONFIG_VHOST 1 00:28:47.837 #define SPDK_CONFIG_VIRTIO 1 00:28:47.837 #undef SPDK_CONFIG_VTUNE 00:28:47.837 #define SPDK_CONFIG_VTUNE_DIR 00:28:47.837 #define SPDK_CONFIG_WERROR 1 00:28:47.837 #define SPDK_CONFIG_WPDK_DIR 00:28:47.837 #undef SPDK_CONFIG_XNVME 00:28:47.837 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:28:47.837 16:05:08 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:28:47.837 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:47.837 16:05:08 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:47.837 16:05:08 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:47.837 16:05:08 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:47.837 16:05:08 reactor_set_interrupt -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:47.837 16:05:08 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:47.837 16:05:08 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:47.837 16:05:08 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:28:47.837 16:05:08 reactor_set_interrupt -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:47.837 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:28:47.837 
16:05:08 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:28:47.837 16:05:08 reactor_set_interrupt -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:28:47.837 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:28:47.837 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:28:47.837 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:28:47.837 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:28:47.837 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- 
common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:28:47.838 
16:05:08 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:28:47.838 16:05:08 
reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:28:47.838 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@240 -- # 
DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j128 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:28:47.839 
16:05:08 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE= 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 2688542 ]] 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 2688542 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.bna0b4 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.bna0b4/tests/interrupt /tmp/spdk.bna0b4 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=954712064 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4329717760 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=123809849344 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=129376292864 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=5566443520 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=64683433984 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688144384 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=25865379840 00:28:47.839 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=25875259392 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9879552 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=339968 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=163840 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=64687697920 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688148480 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=450560 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:47.840 
16:05:08 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=12937621504 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=12937625600 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:28:47.840 * Looking for test storage... 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=123809849344 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=7781036032 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:47.840 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # 
xtrace_fd 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2688594 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2688594 /var/tmp/spdk.sock 00:28:47.840 16:05:08 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@829 
-- # '[' -z 2688594 ']' 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:47.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:47.840 16:05:08 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:47.840 [2024-07-12 16:05:08.265341] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:47.840 [2024-07-12 16:05:08.265400] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2688594 ] 00:28:48.101 [2024-07-12 16:05:08.359988] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:48.101 [2024-07-12 16:05:08.437477] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:48.101 [2024-07-12 16:05:08.437617] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:48.101 [2024-07-12 16:05:08.437619] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:48.101 [2024-07-12 16:05:08.489791] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:28:49.041 16:05:09 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:49.041 16:05:09 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:28:49.041 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:28:49.041 16:05:09 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:49.041 Malloc0 00:28:49.041 Malloc1 00:28:49.041 Malloc2 00:28:49.041 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:28:49.041 16:05:09 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:28:49.041 16:05:09 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:49.041 16:05:09 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:49.041 5000+0 records in 00:28:49.041 5000+0 records out 00:28:49.041 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0166398 s, 615 MB/s 00:28:49.041 16:05:09 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:49.301 AIO0 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 2688594 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 2688594 without_thd 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2688594 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local 
without_thd=without_thd 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:49.301 16:05:09 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:28:49.561 spdk_thread ids are 1 on reactor0. 
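The thread_get_stats calls above are how the test maps a reactor's cpumask to its spdk_thread ids ("spdk_thread ids are 1 on reactor0."). A rough reconstruction of reactor_get_thread_ids from the trace; the jq filter and the rpc invocation are verbatim from the log, while the 0x-to-decimal normalization and the output handling are assumptions:

    reactor_get_thread_ids() {
        local reactor_cpumask=$1
        local jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'

        # the trace shows 0x1 -> 1 and 0x4 -> 4; arithmetic expansion is an assumed way to get there
        reactor_cpumask=$((reactor_cpumask))
        $rpc_py thread_get_stats | jq --arg reactor_cpumask "$reactor_cpumask" "$jq_str"
    }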
00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2688594 0 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2688594 0 idle 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2688594 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2688594 -w 256 00:28:49.561 16:05:09 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2688594 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.31 reactor_0' 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2688594 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.31 reactor_0 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2688594 1 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2688594 1 idle 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2688594 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2688594 -w 256 00:28:49.821 16:05:10 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 
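The top/sed/awk pipeline above is the probe the test uses to decide whether a reactor is idle or busy: one batch-mode top sample per attempt, column 9 read as %CPU, and the 30%/70% thresholds visible in the trace. A sketch of that check; the retry handling around the loop counter (j = 10 in the log) and the way the fractional part is dropped are assumptions, the commands and thresholds are taken from the xtrace:

    reactor_is_busy_or_idle() {
        local pid=$1 idx=$2 state=$3
        local top_reactor cpu_rate

        # single batch-mode, per-thread top pass, filtered to this reactor's thread
        top_reactor=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_$idx")
        cpu_rate=$(echo "$top_reactor" | sed -e 's/^\s*//g' | awk '{print $9}')
        cpu_rate=${cpu_rate%.*}    # 0.0 -> 0, 99.9 -> 99, matching the trace

        if [[ $state == busy ]]; then
            [[ $cpu_rate -lt 70 ]] && return 1    # a polling reactor should sit near 100%
        else
            [[ $cpu_rate -gt 30 ]] && return 1    # an interrupt-mode reactor should stay near 0%
        fi
        return 0
    }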
00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2688639 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_1' 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2688639 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_1 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2688594 2 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2688594 2 idle 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2688594 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2688594 -w 256 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2688640 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_2' 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2688640 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_2 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 
00:28:50.082 16:05:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:28:50.342 [2024-07-12 16:05:10.674795] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:28:50.342 16:05:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:28:50.603 [2024-07-12 16:05:10.894305] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:28:50.603 [2024-07-12 16:05:10.894727] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:50.603 16:05:10 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:28:50.863 [2024-07-12 16:05:11.098258] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:28:50.863 [2024-07-12 16:05:11.098467] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2688594 0 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2688594 0 busy 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2688594 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2688594 -w 256 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2688594 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.72 reactor_0' 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2688594 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.72 reactor_0 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:50.863 16:05:11 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2688594 2 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2688594 2 busy 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2688594 00:28:50.863 16:05:11 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2688594 -w 256 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2688640 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.36 reactor_2' 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2688640 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.36 reactor_2 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:51.123 16:05:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:28:51.383 [2024-07-12 16:05:11.662262] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:28:51.383 [2024-07-12 16:05:11.662370] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2688594 2 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2688594 2 idle 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2688594 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2688594 -w 256 00:28:51.383 16:05:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2688640 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.56 reactor_2' 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2688640 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.56 reactor_2 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:51.644 16:05:11 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:28:51.644 [2024-07-12 16:05:12.050260] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:28:51.644 [2024-07-12 16:05:12.050465] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:51.644 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:28:51.644 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:28:51.644 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:28:51.904 [2024-07-12 16:05:12.262561] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
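The busy/idle probes that dominate this trace all follow the same pattern: one batch-mode top snapshot of the target's threads, the %CPU column for reactor_<idx>, and a threshold compare (busy means at least 70%, idle means at most 30%, per the [[ 99 -lt 70 ]] and [[ 0 -gt 30 ]] checks above). A compressed sketch of that check, with the retry loop and the separate reactor_is_busy/reactor_is_idle wrappers from interrupt/common.sh collapsed into one function:

# Sketch only; interrupt/common.sh retries up to 10 times before giving up,
# which is omitted here.
reactor_cpu_state_ok() {
	local pid=$1 idx=$2 state=$3   # state: busy or idle
	local line cpu_rate
	line=$(top -bHn 1 -p "$pid" -w 256 | grep "reactor_${idx}")
	cpu_rate=$(echo "$line" | sed -e 's/^\s*//g' | awk '{print $9}')
	cpu_rate=${cpu_rate%.*}        # 99.9 -> 99, 6.7 -> 6, 0.0 -> 0
	if [[ $state == busy ]]; then
		[[ $cpu_rate -ge 70 ]]     # e.g. reactor_0 at 99.9% after the -d switch above
	else
		[[ $cpu_rate -le 30 ]]     # e.g. reactor_2 back to 0.0% after re-enabling intr mode
	fi
}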
00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2688594 0 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2688594 0 idle 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2688594 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:51.904 16:05:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2688594 -w 256 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2688594 root 20 0 128.2g 34816 22528 S 6.7 0.0 0:01.48 reactor_0' 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2688594 root 20 0 128.2g 34816 22528 S 6.7 0.0 0:01.48 reactor_0 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:28:52.165 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 2688594 00:28:52.165 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2688594 ']' 00:28:52.165 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2688594 00:28:52.165 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:28:52.165 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:52.165 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2688594 00:28:52.165 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:52.165 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:52.165 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2688594' 00:28:52.165 killing process with pid 2688594 00:28:52.165 16:05:12 reactor_set_interrupt -- 
common/autotest_common.sh@967 -- # kill 2688594 00:28:52.165 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2688594 00:28:52.425 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:28:52.425 16:05:12 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:28:52.425 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:28:52.425 16:05:12 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:52.425 16:05:12 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:52.425 16:05:12 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2689548 00:28:52.425 16:05:12 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:52.425 16:05:12 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2689548 /var/tmp/spdk.sock 00:28:52.425 16:05:12 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:52.425 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 2689548 ']' 00:28:52.425 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:52.425 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:52.425 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:52.425 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:52.425 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:52.425 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:52.425 [2024-07-12 16:05:12.737280] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:52.426 [2024-07-12 16:05:12.737331] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2689548 ] 00:28:52.426 [2024-07-12 16:05:12.804919] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:52.426 [2024-07-12 16:05:12.868729] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:52.426 [2024-07-12 16:05:12.868778] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:52.426 [2024-07-12 16:05:12.868943] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:52.684 [2024-07-12 16:05:12.918841] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
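The second half of the test (the with-threads variant) starts a fresh interrupt_tgt instance and blocks until its RPC socket answers before issuing any thread_get_stats or reactor_set_interrupt_mode calls. A minimal sketch of that start-and-wait step, assuming rpc_get_methods as the liveness probe; the real waitforlisten in autotest_common.sh is more elaborate (it also verifies the pid stays alive and caps retries at 100), and the trap here is simplified relative to the killprocess/cleanup trap in the trace:

intr_tgt=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt
rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
rpc_sock=/var/tmp/spdk.sock

# same invocation as the trace: 3-core mask, private RPC socket, plus the -E -g
# flags used throughout this log
$intr_tgt -m 0x07 -r "$rpc_sock" -E -g &
intr_tgt_pid=$!
trap 'kill -9 $intr_tgt_pid; exit 1' SIGINT SIGTERM EXIT

# poll the socket until the target responds (simplified stand-in for waitforlisten)
for ((i = 100; i > 0; i--)); do
	"$rpc_py" -s "$rpc_sock" rpc_get_methods &>/dev/null && break
	sleep 0.5
done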
00:28:52.684 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:52.684 16:05:12 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0 00:28:52.684 16:05:12 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:28:52.684 16:05:12 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:52.944 Malloc0 00:28:52.944 Malloc1 00:28:52.944 Malloc2 00:28:52.944 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:28:52.944 16:05:13 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:28:52.944 16:05:13 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:52.944 16:05:13 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:52.944 5000+0 records in 00:28:52.944 5000+0 records out 00:28:52.944 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0171692 s, 596 MB/s 00:28:52.944 16:05:13 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:52.944 AIO0 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 2689548 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 2689548 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2689548 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == 
$reactor_cpumask)|.id' 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:53.204 16:05:13 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:28:53.464 spdk_thread ids are 1 on reactor0. 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2689548 0 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2689548 0 idle 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2689548 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2689548 -w 256 00:28:53.464 16:05:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2689548 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.24 reactor_0' 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2689548 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.24 reactor_0 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2689548 1 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2689548 1 idle 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2689548 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:28:53.725 16:05:13 reactor_set_interrupt -- 
interrupt/common.sh@12 -- # local state=idle 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2689548 -w 256 00:28:53.725 16:05:13 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2689552 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.00 reactor_1' 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2689552 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.00 reactor_1 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2689548 2 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2689548 2 idle 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2689548 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2689548 -w 256 00:28:53.725 16:05:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2689553 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.00 reactor_2' 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2689553 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.00 reactor_2 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 
00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:28:53.985 16:05:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:28:54.245 [2024-07-12 16:05:14.513378] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:28:54.245 [2024-07-12 16:05:14.513565] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:28:54.245 [2024-07-12 16:05:14.513814] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:54.245 16:05:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:28:54.505 [2024-07-12 16:05:14.705696] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:28:54.505 [2024-07-12 16:05:14.705960] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2689548 0 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2689548 0 busy 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2689548 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2689548 -w 256 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2689548 root 20 0 128.2g 35840 23552 R 99.9 0.0 0:00.62 reactor_0' 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2689548 root 20 0 128.2g 35840 23552 R 99.9 0.0 0:00.62 reactor_0 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:54.505 16:05:14 
reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2689548 2 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2689548 2 busy 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2689548 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2689548 -w 256 00:28:54.505 16:05:14 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2689553 root 20 0 128.2g 35840 23552 R 99.9 0.0 0:00.35 reactor_2' 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2689553 root 20 0 128.2g 35840 23552 R 99.9 0.0 0:00.35 reactor_2 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:54.765 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:28:55.025 [2024-07-12 16:05:15.251104] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:28:55.025 [2024-07-12 16:05:15.251242] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2689548 2 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2689548 2 idle 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2689548 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2689548 -w 256 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2689553 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.54 reactor_2' 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2689553 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.54 reactor_2 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:55.025 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:28:55.285 [2024-07-12 16:05:15.624020] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:28:55.286 [2024-07-12 16:05:15.624236] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:28:55.286 [2024-07-12 16:05:15.624249] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2689548 0 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2689548 0 idle 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2689548 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2689548 -w 256 00:28:55.286 16:05:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2689548 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:01.36 reactor_0' 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2689548 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:01.36 reactor_0 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:28:55.545 16:05:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 2689548 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 2689548 ']' 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 2689548 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2689548 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = 
sudo ']' 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2689548' 00:28:55.545 killing process with pid 2689548 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 2689548 00:28:55.545 16:05:15 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 2689548 00:28:55.805 16:05:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:28:55.805 16:05:16 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:28:55.805 00:28:55.805 real 0m8.097s 00:28:55.805 user 0m8.015s 00:28:55.805 sys 0m1.555s 00:28:55.805 16:05:16 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:55.805 16:05:16 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:55.805 ************************************ 00:28:55.805 END TEST reactor_set_interrupt 00:28:55.805 ************************************ 00:28:55.805 16:05:16 -- common/autotest_common.sh@1142 -- # return 0 00:28:55.805 16:05:16 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:28:55.805 16:05:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:28:55.805 16:05:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:55.805 16:05:16 -- common/autotest_common.sh@10 -- # set +x 00:28:55.805 ************************************ 00:28:55.805 START TEST reap_unregistered_poller 00:28:55.806 ************************************ 00:28:55.806 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:28:55.806 * Looking for test storage... 00:28:55.806 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:55.806 16:05:16 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:28:55.806 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:28:55.806 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:55.806 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:55.806 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:28:55.806 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:55.806 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:28:55.806 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:28:55.806 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:28:55.806 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:28:55.806 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:28:55.806 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:28:55.806 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:28:55.806 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:28:55.806 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 
00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@57 -- # 
CONFIG_HAVE_LIBBSD=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:28:55.806 16:05:16 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:28:56.068 16:05:16 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:28:56.068 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@10 -- # 
_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:28:56.068 16:05:16 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:28:56.068 #define SPDK_CONFIG_H 00:28:56.068 #define SPDK_CONFIG_APPS 1 00:28:56.068 #define SPDK_CONFIG_ARCH native 00:28:56.068 #undef SPDK_CONFIG_ASAN 00:28:56.068 #undef SPDK_CONFIG_AVAHI 00:28:56.068 #undef SPDK_CONFIG_CET 00:28:56.068 #define SPDK_CONFIG_COVERAGE 1 00:28:56.068 #define SPDK_CONFIG_CROSS_PREFIX 00:28:56.068 #define SPDK_CONFIG_CRYPTO 1 00:28:56.068 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:28:56.068 #undef SPDK_CONFIG_CUSTOMOCF 00:28:56.068 #undef SPDK_CONFIG_DAOS 00:28:56.068 #define SPDK_CONFIG_DAOS_DIR 00:28:56.068 #define SPDK_CONFIG_DEBUG 1 00:28:56.068 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:28:56.068 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:56.068 #define SPDK_CONFIG_DPDK_INC_DIR 00:28:56.068 #define SPDK_CONFIG_DPDK_LIB_DIR 00:28:56.069 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:28:56.069 #undef SPDK_CONFIG_DPDK_UADK 00:28:56.069 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:56.069 #define SPDK_CONFIG_EXAMPLES 1 00:28:56.069 #undef SPDK_CONFIG_FC 00:28:56.069 #define SPDK_CONFIG_FC_PATH 00:28:56.069 #define SPDK_CONFIG_FIO_PLUGIN 1 00:28:56.069 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:28:56.069 #undef SPDK_CONFIG_FUSE 00:28:56.069 #undef SPDK_CONFIG_FUZZER 00:28:56.069 #define SPDK_CONFIG_FUZZER_LIB 00:28:56.069 #undef SPDK_CONFIG_GOLANG 00:28:56.069 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:28:56.069 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:28:56.069 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:28:56.069 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:28:56.069 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:28:56.069 #undef SPDK_CONFIG_HAVE_LIBBSD 00:28:56.069 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:28:56.069 #define SPDK_CONFIG_IDXD 1 00:28:56.069 #define SPDK_CONFIG_IDXD_KERNEL 1 00:28:56.069 #define SPDK_CONFIG_IPSEC_MB 1 00:28:56.069 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:56.069 #define SPDK_CONFIG_ISAL 1 00:28:56.069 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:28:56.069 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:28:56.069 #define SPDK_CONFIG_LIBDIR 00:28:56.069 #undef SPDK_CONFIG_LTO 
00:28:56.069 #define SPDK_CONFIG_MAX_LCORES 128 00:28:56.069 #define SPDK_CONFIG_NVME_CUSE 1 00:28:56.069 #undef SPDK_CONFIG_OCF 00:28:56.069 #define SPDK_CONFIG_OCF_PATH 00:28:56.069 #define SPDK_CONFIG_OPENSSL_PATH 00:28:56.069 #undef SPDK_CONFIG_PGO_CAPTURE 00:28:56.069 #define SPDK_CONFIG_PGO_DIR 00:28:56.069 #undef SPDK_CONFIG_PGO_USE 00:28:56.069 #define SPDK_CONFIG_PREFIX /usr/local 00:28:56.069 #undef SPDK_CONFIG_RAID5F 00:28:56.069 #undef SPDK_CONFIG_RBD 00:28:56.069 #define SPDK_CONFIG_RDMA 1 00:28:56.069 #define SPDK_CONFIG_RDMA_PROV verbs 00:28:56.069 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:28:56.069 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:28:56.069 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:28:56.069 #define SPDK_CONFIG_SHARED 1 00:28:56.069 #undef SPDK_CONFIG_SMA 00:28:56.069 #define SPDK_CONFIG_TESTS 1 00:28:56.069 #undef SPDK_CONFIG_TSAN 00:28:56.069 #define SPDK_CONFIG_UBLK 1 00:28:56.069 #define SPDK_CONFIG_UBSAN 1 00:28:56.069 #undef SPDK_CONFIG_UNIT_TESTS 00:28:56.069 #undef SPDK_CONFIG_URING 00:28:56.069 #define SPDK_CONFIG_URING_PATH 00:28:56.069 #undef SPDK_CONFIG_URING_ZNS 00:28:56.069 #undef SPDK_CONFIG_USDT 00:28:56.069 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:28:56.069 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:28:56.069 #undef SPDK_CONFIG_VFIO_USER 00:28:56.069 #define SPDK_CONFIG_VFIO_USER_DIR 00:28:56.069 #define SPDK_CONFIG_VHOST 1 00:28:56.069 #define SPDK_CONFIG_VIRTIO 1 00:28:56.069 #undef SPDK_CONFIG_VTUNE 00:28:56.069 #define SPDK_CONFIG_VTUNE_DIR 00:28:56.069 #define SPDK_CONFIG_WERROR 1 00:28:56.069 #define SPDK_CONFIG_WPDK_DIR 00:28:56.069 #undef SPDK_CONFIG_XNVME 00:28:56.069 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:28:56.069 16:05:16 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:56.069 16:05:16 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:56.069 16:05:16 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:56.069 16:05:16 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:56.069 16:05:16 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:56.069 16:05:16 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:56.069 16:05:16 reap_unregistered_poller -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:56.069 16:05:16 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:28:56.069 16:05:16 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@81 -- # 
[[ ............................... != QEMU ]] 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:28:56.069 16:05:16 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 
00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:28:56.069 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:28:56.070 16:05:16 reap_unregistered_poller -- 
common/autotest_common.sh@126 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:28:56.070 16:05:16 
reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 
00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:56.070 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export 
SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j128 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 2690215 ]] 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 2690215 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ 
-v testdir ]] 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.RJcEmG 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.RJcEmG/tests/interrupt /tmp/spdk.RJcEmG 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=954712064 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4329717760 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=123809677312 00:28:56.071 
16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=129376292864 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=5566615552 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=64683433984 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688144384 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=25865379840 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=25875259392 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9879552 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=efivarfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=efivarfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=339968 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=507904 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=163840 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=64687697920 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=64688148480 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=450560 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=12937621504 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=12937625600 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4096 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:28:56.071 * Looking for test storage... 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=123809677312 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=7781208064 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.071 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.072 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 
00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2690304 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:56.072 16:05:16 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2690304 /var/tmp/spdk.sock 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 2690304 ']' 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:56.072 16:05:16 reap_unregistered_poller -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:56.072 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:56.072 16:05:16 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:56.072 [2024-07-12 16:05:16.437516] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:56.072 [2024-07-12 16:05:16.437579] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2690304 ] 00:28:56.331 [2024-07-12 16:05:16.527439] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:56.331 [2024-07-12 16:05:16.606236] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:56.331 [2024-07-12 16:05:16.606385] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:56.331 [2024-07-12 16:05:16.606386] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:56.331 [2024-07-12 16:05:16.657739] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:28:56.897 16:05:17 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:56.897 16:05:17 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:28:56.897 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:28:56.897 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:28:56.897 16:05:17 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:56.897 16:05:17 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:56.897 16:05:17 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:56.898 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:28:56.898 "name": "app_thread", 00:28:56.898 "id": 1, 00:28:56.898 "active_pollers": [], 00:28:56.898 "timed_pollers": [ 00:28:56.898 { 00:28:56.898 "name": "rpc_subsystem_poll_servers", 00:28:56.898 "id": 1, 00:28:56.898 "state": "waiting", 00:28:56.898 "run_count": 0, 00:28:56.898 "busy_count": 0, 00:28:56.898 "period_ticks": 10400000 00:28:56.898 } 00:28:56.898 ], 00:28:56.898 "paused_pollers": [] 00:28:56.898 }' 00:28:57.156 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:28:57.156 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:28:57.156 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:28:57.156 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:28:57.156 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:28:57.156 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:28:57.156 16:05:17 
reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:28:57.156 16:05:17 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:57.156 16:05:17 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:57.156 5000+0 records in 00:28:57.156 5000+0 records out 00:28:57.156 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0174978 s, 585 MB/s 00:28:57.156 16:05:17 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:57.416 AIO0 00:28:57.416 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:57.675 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:28:57.675 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:28:57.675 16:05:17 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:28:57.675 16:05:17 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:28:57.675 16:05:17 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:57.675 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:28:57.675 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:28:57.675 "name": "app_thread", 00:28:57.675 "id": 1, 00:28:57.675 "active_pollers": [], 00:28:57.676 "timed_pollers": [ 00:28:57.676 { 00:28:57.676 "name": "rpc_subsystem_poll_servers", 00:28:57.676 "id": 1, 00:28:57.676 "state": "waiting", 00:28:57.676 "run_count": 0, 00:28:57.676 "busy_count": 0, 00:28:57.676 "period_ticks": 10400000 00:28:57.676 } 00:28:57.676 ], 00:28:57.676 "paused_pollers": [] 00:28:57.676 }' 00:28:57.676 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:28:57.676 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:28:57.676 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:28:57.676 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:28:57.935 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:28:57.935 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:28:57.935 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:28:57.935 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 2690304 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 2690304 ']' 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 2690304 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2690304 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2690304' 00:28:57.935 killing process with pid 2690304 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 2690304 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 2690304 00:28:57.935 16:05:18 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:28:57.935 16:05:18 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:28:57.935 00:28:57.935 real 0m2.217s 00:28:57.935 user 0m1.263s 00:28:57.935 sys 0m0.605s 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:57.935 16:05:18 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:57.935 ************************************ 00:28:57.935 END TEST reap_unregistered_poller 00:28:57.935 ************************************ 00:28:57.935 16:05:18 -- common/autotest_common.sh@1142 -- # return 0 00:28:57.935 16:05:18 -- spdk/autotest.sh@198 -- # uname -s 00:28:58.195 16:05:18 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:28:58.195 16:05:18 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:28:58.195 16:05:18 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:28:58.195 16:05:18 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@260 -- # timing_exit lib 00:28:58.195 16:05:18 -- common/autotest_common.sh@728 -- # xtrace_disable 00:28:58.195 16:05:18 -- common/autotest_common.sh@10 -- # set +x 00:28:58.195 16:05:18 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:28:58.195 16:05:18 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:28:58.195 16:05:18 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:28:58.195 16:05:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:58.195 16:05:18 -- common/autotest_common.sh@10 -- # set +x 00:28:58.195 ************************************ 00:28:58.195 START TEST compress_compdev 00:28:58.195 ************************************ 00:28:58.195 16:05:18 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh 
compdev 00:28:58.195 * Looking for test storage... 00:28:58.195 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:58.195 16:05:18 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:58.195 16:05:18 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:58.195 16:05:18 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:58.195 16:05:18 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:58.195 16:05:18 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:58.195 16:05:18 compress_compdev -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:58.195 16:05:18 compress_compdev -- paths/export.sh@5 -- # export PATH 00:28:58.195 16:05:18 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:28:58.195 16:05:18 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2690706 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2690706 00:28:58.195 16:05:18 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2690706 ']' 00:28:58.195 16:05:18 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:28:58.195 16:05:18 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:58.195 16:05:18 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:58.195 16:05:18 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:58.195 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:28:58.195 16:05:18 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:58.195 16:05:18 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:28:58.455 [2024-07-12 16:05:18.653962] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:28:58.455 [2024-07-12 16:05:18.654017] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2690706 ] 00:28:58.455 [2024-07-12 16:05:18.729929] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:58.455 [2024-07-12 16:05:18.824753] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:58.455 [2024-07-12 16:05:18.824846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:59.023 [2024-07-12 16:05:19.359014] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:59.023 16:05:19 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:59.023 16:05:19 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:28:59.023 16:05:19 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:28:59.023 16:05:19 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:59.023 16:05:19 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:02.317 [2024-07-12 16:05:22.499920] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15f7780 PMD being used: compress_qat 00:29:02.317 16:05:22 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:02.317 16:05:22 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:02.317 16:05:22 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:02.317 16:05:22 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:02.317 16:05:22 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:02.317 16:05:22 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:02.317 16:05:22 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:02.317 16:05:22 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:02.577 [ 00:29:02.577 { 00:29:02.577 "name": "Nvme0n1", 00:29:02.577 "aliases": [ 00:29:02.577 "b5b50b30-a574-4b63-bf1a-7437521b74e5" 00:29:02.577 ], 00:29:02.577 "product_name": "NVMe disk", 00:29:02.577 "block_size": 512, 00:29:02.577 "num_blocks": 3907029168, 00:29:02.577 "uuid": "b5b50b30-a574-4b63-bf1a-7437521b74e5", 00:29:02.577 "assigned_rate_limits": { 00:29:02.577 "rw_ios_per_sec": 0, 00:29:02.577 "rw_mbytes_per_sec": 0, 00:29:02.577 "r_mbytes_per_sec": 0, 00:29:02.577 "w_mbytes_per_sec": 0 00:29:02.577 }, 00:29:02.577 "claimed": false, 00:29:02.577 "zoned": false, 00:29:02.577 "supported_io_types": { 00:29:02.577 "read": true, 00:29:02.577 "write": true, 00:29:02.577 "unmap": true, 00:29:02.577 "flush": true, 00:29:02.577 "reset": true, 00:29:02.577 "nvme_admin": true, 00:29:02.577 "nvme_io": true, 00:29:02.577 "nvme_io_md": false, 00:29:02.577 "write_zeroes": true, 00:29:02.577 "zcopy": false, 
00:29:02.577 "get_zone_info": false, 00:29:02.577 "zone_management": false, 00:29:02.577 "zone_append": false, 00:29:02.577 "compare": false, 00:29:02.577 "compare_and_write": false, 00:29:02.577 "abort": true, 00:29:02.577 "seek_hole": false, 00:29:02.577 "seek_data": false, 00:29:02.577 "copy": false, 00:29:02.577 "nvme_iov_md": false 00:29:02.577 }, 00:29:02.577 "driver_specific": { 00:29:02.577 "nvme": [ 00:29:02.577 { 00:29:02.577 "pci_address": "0000:65:00.0", 00:29:02.577 "trid": { 00:29:02.577 "trtype": "PCIe", 00:29:02.577 "traddr": "0000:65:00.0" 00:29:02.577 }, 00:29:02.577 "ctrlr_data": { 00:29:02.577 "cntlid": 0, 00:29:02.577 "vendor_id": "0x8086", 00:29:02.577 "model_number": "INTEL SSDPE2KX020T8", 00:29:02.577 "serial_number": "PHLJ9512038S2P0BGN", 00:29:02.577 "firmware_revision": "VDV10184", 00:29:02.577 "oacs": { 00:29:02.577 "security": 0, 00:29:02.577 "format": 1, 00:29:02.577 "firmware": 1, 00:29:02.577 "ns_manage": 1 00:29:02.577 }, 00:29:02.577 "multi_ctrlr": false, 00:29:02.577 "ana_reporting": false 00:29:02.577 }, 00:29:02.577 "vs": { 00:29:02.577 "nvme_version": "1.2" 00:29:02.577 }, 00:29:02.577 "ns_data": { 00:29:02.577 "id": 1, 00:29:02.577 "can_share": false 00:29:02.577 } 00:29:02.577 } 00:29:02.577 ], 00:29:02.577 "mp_policy": "active_passive" 00:29:02.577 } 00:29:02.577 } 00:29:02.577 ] 00:29:02.577 16:05:22 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:02.577 16:05:22 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:02.836 [2024-07-12 16:05:23.117443] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x145ca20 PMD being used: compress_qat 00:29:03.775 a14fedf9-4c4d-4143-ae72-74578660f2f7 00:29:03.775 16:05:24 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:04.036 ede0d444-143f-4f0f-9be0-5d012a07d122 00:29:04.036 16:05:24 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:04.036 16:05:24 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:04.036 16:05:24 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:04.036 16:05:24 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:04.036 16:05:24 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:04.036 16:05:24 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:04.036 16:05:24 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:04.296 16:05:24 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:04.556 [ 00:29:04.556 { 00:29:04.556 "name": "ede0d444-143f-4f0f-9be0-5d012a07d122", 00:29:04.556 "aliases": [ 00:29:04.556 "lvs0/lv0" 00:29:04.556 ], 00:29:04.556 "product_name": "Logical Volume", 00:29:04.556 "block_size": 512, 00:29:04.556 "num_blocks": 204800, 00:29:04.556 "uuid": "ede0d444-143f-4f0f-9be0-5d012a07d122", 00:29:04.556 "assigned_rate_limits": { 00:29:04.556 "rw_ios_per_sec": 0, 00:29:04.556 "rw_mbytes_per_sec": 0, 00:29:04.556 "r_mbytes_per_sec": 0, 00:29:04.556 "w_mbytes_per_sec": 0 00:29:04.556 }, 00:29:04.556 "claimed": false, 00:29:04.556 "zoned": false, 00:29:04.556 "supported_io_types": { 
00:29:04.556 "read": true, 00:29:04.556 "write": true, 00:29:04.556 "unmap": true, 00:29:04.556 "flush": false, 00:29:04.556 "reset": true, 00:29:04.556 "nvme_admin": false, 00:29:04.556 "nvme_io": false, 00:29:04.556 "nvme_io_md": false, 00:29:04.556 "write_zeroes": true, 00:29:04.556 "zcopy": false, 00:29:04.556 "get_zone_info": false, 00:29:04.556 "zone_management": false, 00:29:04.556 "zone_append": false, 00:29:04.556 "compare": false, 00:29:04.556 "compare_and_write": false, 00:29:04.556 "abort": false, 00:29:04.556 "seek_hole": true, 00:29:04.556 "seek_data": true, 00:29:04.556 "copy": false, 00:29:04.556 "nvme_iov_md": false 00:29:04.556 }, 00:29:04.556 "driver_specific": { 00:29:04.556 "lvol": { 00:29:04.556 "lvol_store_uuid": "a14fedf9-4c4d-4143-ae72-74578660f2f7", 00:29:04.556 "base_bdev": "Nvme0n1", 00:29:04.556 "thin_provision": true, 00:29:04.556 "num_allocated_clusters": 0, 00:29:04.556 "snapshot": false, 00:29:04.556 "clone": false, 00:29:04.556 "esnap_clone": false 00:29:04.556 } 00:29:04.556 } 00:29:04.556 } 00:29:04.556 ] 00:29:04.556 16:05:24 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:04.556 16:05:24 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:04.556 16:05:24 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:04.556 [2024-07-12 16:05:24.999277] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:04.556 COMP_lvs0/lv0 00:29:04.845 16:05:25 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:04.845 16:05:25 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:04.845 16:05:25 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:04.845 16:05:25 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:04.845 16:05:25 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:04.845 16:05:25 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:04.845 16:05:25 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:04.845 16:05:25 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:05.168 [ 00:29:05.168 { 00:29:05.168 "name": "COMP_lvs0/lv0", 00:29:05.168 "aliases": [ 00:29:05.168 "e0539f21-2b41-51e8-a9aa-3ed88e271f4e" 00:29:05.168 ], 00:29:05.168 "product_name": "compress", 00:29:05.168 "block_size": 512, 00:29:05.168 "num_blocks": 200704, 00:29:05.168 "uuid": "e0539f21-2b41-51e8-a9aa-3ed88e271f4e", 00:29:05.168 "assigned_rate_limits": { 00:29:05.168 "rw_ios_per_sec": 0, 00:29:05.168 "rw_mbytes_per_sec": 0, 00:29:05.168 "r_mbytes_per_sec": 0, 00:29:05.168 "w_mbytes_per_sec": 0 00:29:05.168 }, 00:29:05.168 "claimed": false, 00:29:05.168 "zoned": false, 00:29:05.168 "supported_io_types": { 00:29:05.168 "read": true, 00:29:05.168 "write": true, 00:29:05.168 "unmap": false, 00:29:05.168 "flush": false, 00:29:05.168 "reset": false, 00:29:05.168 "nvme_admin": false, 00:29:05.168 "nvme_io": false, 00:29:05.168 "nvme_io_md": false, 00:29:05.168 "write_zeroes": true, 00:29:05.168 "zcopy": false, 00:29:05.168 "get_zone_info": false, 00:29:05.168 "zone_management": false, 00:29:05.168 "zone_append": false, 00:29:05.168 
"compare": false, 00:29:05.168 "compare_and_write": false, 00:29:05.168 "abort": false, 00:29:05.168 "seek_hole": false, 00:29:05.168 "seek_data": false, 00:29:05.168 "copy": false, 00:29:05.168 "nvme_iov_md": false 00:29:05.168 }, 00:29:05.168 "driver_specific": { 00:29:05.168 "compress": { 00:29:05.168 "name": "COMP_lvs0/lv0", 00:29:05.168 "base_bdev_name": "ede0d444-143f-4f0f-9be0-5d012a07d122" 00:29:05.168 } 00:29:05.168 } 00:29:05.168 } 00:29:05.168 ] 00:29:05.168 16:05:25 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:05.168 16:05:25 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:05.168 [2024-07-12 16:05:25.544985] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f4a181b15c0 PMD being used: compress_qat 00:29:05.168 [2024-07-12 16:05:25.547763] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15f4910 PMD being used: compress_qat 00:29:05.168 Running I/O for 3 seconds... 00:29:08.469 00:29:08.469 Latency(us) 00:29:08.469 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:08.469 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:08.469 Verification LBA range: start 0x0 length 0x3100 00:29:08.469 COMP_lvs0/lv0 : 3.01 1530.90 5.98 0.00 0.00 20819.85 434.81 22483.89 00:29:08.469 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:08.469 Verification LBA range: start 0x3100 length 0x3100 00:29:08.469 COMP_lvs0/lv0 : 3.01 1612.40 6.30 0.00 0.00 19717.26 384.39 22584.71 00:29:08.469 =================================================================================================================== 00:29:08.469 Total : 3143.31 12.28 0.00 0.00 20254.17 384.39 22584.71 00:29:08.469 0 00:29:08.469 16:05:28 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:08.469 16:05:28 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:08.469 16:05:28 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:08.729 16:05:29 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:08.729 16:05:29 compress_compdev -- compress/compress.sh@78 -- # killprocess 2690706 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2690706 ']' 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2690706 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2690706 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2690706' 00:29:08.729 killing process with pid 2690706 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@967 -- # kill 2690706 00:29:08.729 Received shutdown signal, test time was about 3.000000 seconds 00:29:08.729 00:29:08.729 Latency(us) 00:29:08.729 Device Information : runtime(s) 
IOPS MiB/s Fail/s TO/s Average min max 00:29:08.729 =================================================================================================================== 00:29:08.729 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:08.729 16:05:29 compress_compdev -- common/autotest_common.sh@972 -- # wait 2690706 00:29:11.272 16:05:31 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:29:11.272 16:05:31 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:11.272 16:05:31 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2692808 00:29:11.272 16:05:31 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:11.272 16:05:31 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2692808 00:29:11.272 16:05:31 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:11.272 16:05:31 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2692808 ']' 00:29:11.272 16:05:31 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:11.272 16:05:31 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:11.272 16:05:31 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:11.272 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:11.272 16:05:31 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:11.272 16:05:31 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:11.272 [2024-07-12 16:05:31.591363] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:29:11.272 [2024-07-12 16:05:31.591432] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2692808 ] 00:29:11.272 [2024-07-12 16:05:31.672803] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:11.533 [2024-07-12 16:05:31.773391] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:11.533 [2024-07-12 16:05:31.773395] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:12.104 [2024-07-12 16:05:32.313187] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:12.104 16:05:32 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:12.104 16:05:32 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:12.104 16:05:32 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:29:12.104 16:05:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:12.104 16:05:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:15.401 [2024-07-12 16:05:35.476054] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x263b780 PMD being used: compress_qat 00:29:15.401 16:05:35 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:15.401 16:05:35 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:15.401 16:05:35 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:15.401 16:05:35 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:15.401 16:05:35 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:15.401 16:05:35 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:15.401 16:05:35 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:15.401 16:05:35 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:15.661 [ 00:29:15.661 { 00:29:15.661 "name": "Nvme0n1", 00:29:15.661 "aliases": [ 00:29:15.661 "c6a3a497-a8d5-4668-8589-243b5f5ef978" 00:29:15.661 ], 00:29:15.661 "product_name": "NVMe disk", 00:29:15.661 "block_size": 512, 00:29:15.661 "num_blocks": 3907029168, 00:29:15.661 "uuid": "c6a3a497-a8d5-4668-8589-243b5f5ef978", 00:29:15.661 "assigned_rate_limits": { 00:29:15.661 "rw_ios_per_sec": 0, 00:29:15.661 "rw_mbytes_per_sec": 0, 00:29:15.661 "r_mbytes_per_sec": 0, 00:29:15.661 "w_mbytes_per_sec": 0 00:29:15.661 }, 00:29:15.661 "claimed": false, 00:29:15.661 "zoned": false, 00:29:15.661 "supported_io_types": { 00:29:15.661 "read": true, 00:29:15.661 "write": true, 00:29:15.661 "unmap": true, 00:29:15.661 "flush": true, 00:29:15.661 "reset": true, 00:29:15.661 "nvme_admin": true, 00:29:15.661 "nvme_io": true, 00:29:15.661 "nvme_io_md": false, 00:29:15.661 "write_zeroes": true, 00:29:15.661 "zcopy": false, 00:29:15.661 "get_zone_info": false, 00:29:15.661 "zone_management": false, 00:29:15.661 "zone_append": false, 00:29:15.661 "compare": false, 00:29:15.661 "compare_and_write": false, 00:29:15.661 "abort": true, 00:29:15.661 "seek_hole": false, 00:29:15.661 "seek_data": false, 00:29:15.661 
"copy": false, 00:29:15.661 "nvme_iov_md": false 00:29:15.661 }, 00:29:15.662 "driver_specific": { 00:29:15.662 "nvme": [ 00:29:15.662 { 00:29:15.662 "pci_address": "0000:65:00.0", 00:29:15.662 "trid": { 00:29:15.662 "trtype": "PCIe", 00:29:15.662 "traddr": "0000:65:00.0" 00:29:15.662 }, 00:29:15.662 "ctrlr_data": { 00:29:15.662 "cntlid": 0, 00:29:15.662 "vendor_id": "0x8086", 00:29:15.662 "model_number": "INTEL SSDPE2KX020T8", 00:29:15.662 "serial_number": "PHLJ9512038S2P0BGN", 00:29:15.662 "firmware_revision": "VDV10184", 00:29:15.662 "oacs": { 00:29:15.662 "security": 0, 00:29:15.662 "format": 1, 00:29:15.662 "firmware": 1, 00:29:15.662 "ns_manage": 1 00:29:15.662 }, 00:29:15.662 "multi_ctrlr": false, 00:29:15.662 "ana_reporting": false 00:29:15.662 }, 00:29:15.662 "vs": { 00:29:15.662 "nvme_version": "1.2" 00:29:15.662 }, 00:29:15.662 "ns_data": { 00:29:15.662 "id": 1, 00:29:15.662 "can_share": false 00:29:15.662 } 00:29:15.662 } 00:29:15.662 ], 00:29:15.662 "mp_policy": "active_passive" 00:29:15.662 } 00:29:15.662 } 00:29:15.662 ] 00:29:15.662 16:05:35 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:15.662 16:05:35 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:15.662 [2024-07-12 16:05:36.089644] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2473940 PMD being used: compress_qat 00:29:17.050 a88742b1-97d4-4cdb-ac84-4dd63575c890 00:29:17.050 16:05:37 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:17.050 8461ba22-fb1f-469f-bc96-44824dd1fe25 00:29:17.050 16:05:37 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:17.050 16:05:37 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:17.050 16:05:37 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:17.050 16:05:37 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:17.050 16:05:37 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:17.050 16:05:37 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:17.050 16:05:37 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:17.310 16:05:37 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:17.571 [ 00:29:17.571 { 00:29:17.571 "name": "8461ba22-fb1f-469f-bc96-44824dd1fe25", 00:29:17.571 "aliases": [ 00:29:17.571 "lvs0/lv0" 00:29:17.571 ], 00:29:17.571 "product_name": "Logical Volume", 00:29:17.571 "block_size": 512, 00:29:17.571 "num_blocks": 204800, 00:29:17.571 "uuid": "8461ba22-fb1f-469f-bc96-44824dd1fe25", 00:29:17.571 "assigned_rate_limits": { 00:29:17.571 "rw_ios_per_sec": 0, 00:29:17.571 "rw_mbytes_per_sec": 0, 00:29:17.571 "r_mbytes_per_sec": 0, 00:29:17.571 "w_mbytes_per_sec": 0 00:29:17.571 }, 00:29:17.571 "claimed": false, 00:29:17.571 "zoned": false, 00:29:17.571 "supported_io_types": { 00:29:17.571 "read": true, 00:29:17.571 "write": true, 00:29:17.571 "unmap": true, 00:29:17.571 "flush": false, 00:29:17.571 "reset": true, 00:29:17.571 "nvme_admin": false, 00:29:17.571 "nvme_io": false, 00:29:17.571 "nvme_io_md": false, 00:29:17.571 "write_zeroes": true, 00:29:17.571 
"zcopy": false, 00:29:17.571 "get_zone_info": false, 00:29:17.571 "zone_management": false, 00:29:17.571 "zone_append": false, 00:29:17.571 "compare": false, 00:29:17.571 "compare_and_write": false, 00:29:17.571 "abort": false, 00:29:17.571 "seek_hole": true, 00:29:17.571 "seek_data": true, 00:29:17.571 "copy": false, 00:29:17.571 "nvme_iov_md": false 00:29:17.571 }, 00:29:17.571 "driver_specific": { 00:29:17.571 "lvol": { 00:29:17.571 "lvol_store_uuid": "a88742b1-97d4-4cdb-ac84-4dd63575c890", 00:29:17.571 "base_bdev": "Nvme0n1", 00:29:17.571 "thin_provision": true, 00:29:17.571 "num_allocated_clusters": 0, 00:29:17.571 "snapshot": false, 00:29:17.571 "clone": false, 00:29:17.571 "esnap_clone": false 00:29:17.571 } 00:29:17.571 } 00:29:17.571 } 00:29:17.571 ] 00:29:17.571 16:05:37 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:17.571 16:05:37 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:29:17.571 16:05:37 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:29:17.571 [2024-07-12 16:05:38.017366] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:17.571 COMP_lvs0/lv0 00:29:17.832 16:05:38 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:17.832 16:05:38 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:17.832 16:05:38 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:17.832 16:05:38 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:17.832 16:05:38 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:17.832 16:05:38 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:17.832 16:05:38 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:17.832 16:05:38 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:18.092 [ 00:29:18.092 { 00:29:18.092 "name": "COMP_lvs0/lv0", 00:29:18.092 "aliases": [ 00:29:18.092 "384e5dd3-8562-520d-850d-1a47644be29f" 00:29:18.092 ], 00:29:18.092 "product_name": "compress", 00:29:18.092 "block_size": 512, 00:29:18.092 "num_blocks": 200704, 00:29:18.092 "uuid": "384e5dd3-8562-520d-850d-1a47644be29f", 00:29:18.092 "assigned_rate_limits": { 00:29:18.092 "rw_ios_per_sec": 0, 00:29:18.092 "rw_mbytes_per_sec": 0, 00:29:18.092 "r_mbytes_per_sec": 0, 00:29:18.092 "w_mbytes_per_sec": 0 00:29:18.092 }, 00:29:18.092 "claimed": false, 00:29:18.092 "zoned": false, 00:29:18.092 "supported_io_types": { 00:29:18.092 "read": true, 00:29:18.092 "write": true, 00:29:18.092 "unmap": false, 00:29:18.092 "flush": false, 00:29:18.092 "reset": false, 00:29:18.092 "nvme_admin": false, 00:29:18.092 "nvme_io": false, 00:29:18.092 "nvme_io_md": false, 00:29:18.092 "write_zeroes": true, 00:29:18.092 "zcopy": false, 00:29:18.092 "get_zone_info": false, 00:29:18.092 "zone_management": false, 00:29:18.092 "zone_append": false, 00:29:18.092 "compare": false, 00:29:18.092 "compare_and_write": false, 00:29:18.092 "abort": false, 00:29:18.092 "seek_hole": false, 00:29:18.092 "seek_data": false, 00:29:18.092 "copy": false, 00:29:18.092 "nvme_iov_md": false 00:29:18.092 }, 00:29:18.092 "driver_specific": { 00:29:18.092 
"compress": { 00:29:18.092 "name": "COMP_lvs0/lv0", 00:29:18.092 "base_bdev_name": "8461ba22-fb1f-469f-bc96-44824dd1fe25" 00:29:18.092 } 00:29:18.092 } 00:29:18.092 } 00:29:18.092 ] 00:29:18.092 16:05:38 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:18.092 16:05:38 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:18.092 [2024-07-12 16:05:38.534901] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8e981b15c0 PMD being used: compress_qat 00:29:18.092 [2024-07-12 16:05:38.537737] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x248af30 PMD being used: compress_qat 00:29:18.092 Running I/O for 3 seconds... 00:29:21.394 00:29:21.394 Latency(us) 00:29:21.394 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:21.394 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:21.394 Verification LBA range: start 0x0 length 0x3100 00:29:21.394 COMP_lvs0/lv0 : 3.01 1539.16 6.01 0.00 0.00 20721.17 494.67 21778.12 00:29:21.394 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:21.394 Verification LBA range: start 0x3100 length 0x3100 00:29:21.394 COMP_lvs0/lv0 : 3.01 1620.25 6.33 0.00 0.00 19625.50 332.41 22383.06 00:29:21.394 =================================================================================================================== 00:29:21.394 Total : 3159.41 12.34 0.00 0.00 20159.23 332.41 22383.06 00:29:21.394 0 00:29:21.394 16:05:41 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:21.394 16:05:41 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:21.394 16:05:41 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:21.655 16:05:41 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:21.655 16:05:42 compress_compdev -- compress/compress.sh@78 -- # killprocess 2692808 00:29:21.655 16:05:42 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2692808 ']' 00:29:21.655 16:05:42 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2692808 00:29:21.655 16:05:42 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:21.655 16:05:42 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:21.655 16:05:42 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2692808 00:29:21.655 16:05:42 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:21.655 16:05:42 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:21.655 16:05:42 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2692808' 00:29:21.655 killing process with pid 2692808 00:29:21.655 16:05:42 compress_compdev -- common/autotest_common.sh@967 -- # kill 2692808 00:29:21.655 Received shutdown signal, test time was about 3.000000 seconds 00:29:21.655 00:29:21.655 Latency(us) 00:29:21.655 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:21.655 =================================================================================================================== 00:29:21.655 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:21.655 16:05:42 compress_compdev -- 
common/autotest_common.sh@972 -- # wait 2692808 00:29:24.210 16:05:44 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:29:24.210 16:05:44 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:24.210 16:05:44 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2694757 00:29:24.210 16:05:44 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:24.210 16:05:44 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2694757 00:29:24.210 16:05:44 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:24.210 16:05:44 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2694757 ']' 00:29:24.210 16:05:44 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:24.210 16:05:44 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:24.210 16:05:44 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:24.210 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:24.210 16:05:44 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:24.210 16:05:44 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:24.210 [2024-07-12 16:05:44.539309] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:29:24.210 [2024-07-12 16:05:44.539376] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2694757 ] 00:29:24.210 [2024-07-12 16:05:44.623525] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:24.470 [2024-07-12 16:05:44.726504] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:24.470 [2024-07-12 16:05:44.726506] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:25.040 [2024-07-12 16:05:45.268141] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:25.040 16:05:45 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:25.040 16:05:45 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:25.040 16:05:45 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:29:25.040 16:05:45 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:25.040 16:05:45 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:28.336 [2024-07-12 16:05:48.452049] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x207a780 PMD being used: compress_qat 00:29:28.336 16:05:48 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:28.336 16:05:48 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:28.336 16:05:48 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:28.336 16:05:48 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:28.336 16:05:48 
compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:28.336 16:05:48 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:28.336 16:05:48 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:28.336 16:05:48 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:28.595 [ 00:29:28.595 { 00:29:28.595 "name": "Nvme0n1", 00:29:28.595 "aliases": [ 00:29:28.595 "f56158d8-7941-4828-a2fc-204e8e50fd18" 00:29:28.595 ], 00:29:28.595 "product_name": "NVMe disk", 00:29:28.595 "block_size": 512, 00:29:28.595 "num_blocks": 3907029168, 00:29:28.595 "uuid": "f56158d8-7941-4828-a2fc-204e8e50fd18", 00:29:28.595 "assigned_rate_limits": { 00:29:28.595 "rw_ios_per_sec": 0, 00:29:28.595 "rw_mbytes_per_sec": 0, 00:29:28.595 "r_mbytes_per_sec": 0, 00:29:28.595 "w_mbytes_per_sec": 0 00:29:28.595 }, 00:29:28.595 "claimed": false, 00:29:28.595 "zoned": false, 00:29:28.595 "supported_io_types": { 00:29:28.595 "read": true, 00:29:28.595 "write": true, 00:29:28.595 "unmap": true, 00:29:28.595 "flush": true, 00:29:28.595 "reset": true, 00:29:28.595 "nvme_admin": true, 00:29:28.595 "nvme_io": true, 00:29:28.595 "nvme_io_md": false, 00:29:28.595 "write_zeroes": true, 00:29:28.595 "zcopy": false, 00:29:28.595 "get_zone_info": false, 00:29:28.595 "zone_management": false, 00:29:28.595 "zone_append": false, 00:29:28.595 "compare": false, 00:29:28.595 "compare_and_write": false, 00:29:28.595 "abort": true, 00:29:28.595 "seek_hole": false, 00:29:28.595 "seek_data": false, 00:29:28.595 "copy": false, 00:29:28.595 "nvme_iov_md": false 00:29:28.595 }, 00:29:28.595 "driver_specific": { 00:29:28.595 "nvme": [ 00:29:28.595 { 00:29:28.595 "pci_address": "0000:65:00.0", 00:29:28.595 "trid": { 00:29:28.595 "trtype": "PCIe", 00:29:28.595 "traddr": "0000:65:00.0" 00:29:28.595 }, 00:29:28.595 "ctrlr_data": { 00:29:28.595 "cntlid": 0, 00:29:28.595 "vendor_id": "0x8086", 00:29:28.595 "model_number": "INTEL SSDPE2KX020T8", 00:29:28.595 "serial_number": "PHLJ9512038S2P0BGN", 00:29:28.595 "firmware_revision": "VDV10184", 00:29:28.595 "oacs": { 00:29:28.595 "security": 0, 00:29:28.595 "format": 1, 00:29:28.595 "firmware": 1, 00:29:28.595 "ns_manage": 1 00:29:28.595 }, 00:29:28.595 "multi_ctrlr": false, 00:29:28.595 "ana_reporting": false 00:29:28.595 }, 00:29:28.595 "vs": { 00:29:28.595 "nvme_version": "1.2" 00:29:28.595 }, 00:29:28.595 "ns_data": { 00:29:28.595 "id": 1, 00:29:28.595 "can_share": false 00:29:28.595 } 00:29:28.595 } 00:29:28.595 ], 00:29:28.595 "mp_policy": "active_passive" 00:29:28.595 } 00:29:28.595 } 00:29:28.595 ] 00:29:28.595 16:05:48 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:28.595 16:05:48 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:28.854 [2024-07-12 16:05:49.141544] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1eb2940 PMD being used: compress_qat 00:29:29.792 707bd857-0154-43ea-b485-8782cd579f01 00:29:29.792 16:05:50 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:30.052 1cfde57b-ee07-4abc-bd4b-062e42c4da76 00:29:30.052 16:05:50 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 
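Annotation: the create_vols step that each run repeats, and that the trace is in the middle of here for the 4096-byte variant, reduces to a short RPC sequence against the idle bdevperf app. The sketch below uses the same rpc.py calls shown in the trace, with SPDK_DIR/RPC as shorthand for the full workspace paths; the pipeline on the first line is inferred from the two commands logged at compress.sh line 34, and only one bdev_compress_create variant is issued per run.

    RPC="$SPDK_DIR/scripts/rpc.py"
    # compress.sh@34 above: pipe the generated NVMe bdev config into the running app
    $SPDK_DIR/scripts/gen_nvme.sh | $RPC load_subsystem_config
    # lvstore on Nvme0n1 plus a 100 MiB thin-provisioned lvol (204800 x 512 B blocks in the dump above)
    $RPC bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    $RPC bdev_lvol_create -t -l lvs0 lv0 100
    # wrap it in a compress bdev backed by the /tmp/pmem directory created at the start of the test;
    # -l picks the compressed volume's logical block size and is omitted for the default 512 B run
    $RPC bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096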
00:29:30.052 16:05:50 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:30.052 16:05:50 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:30.052 16:05:50 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:30.052 16:05:50 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:30.052 16:05:50 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:30.052 16:05:50 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:30.347 16:05:50 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:30.607 [ 00:29:30.607 { 00:29:30.607 "name": "1cfde57b-ee07-4abc-bd4b-062e42c4da76", 00:29:30.607 "aliases": [ 00:29:30.607 "lvs0/lv0" 00:29:30.607 ], 00:29:30.607 "product_name": "Logical Volume", 00:29:30.607 "block_size": 512, 00:29:30.607 "num_blocks": 204800, 00:29:30.607 "uuid": "1cfde57b-ee07-4abc-bd4b-062e42c4da76", 00:29:30.607 "assigned_rate_limits": { 00:29:30.607 "rw_ios_per_sec": 0, 00:29:30.607 "rw_mbytes_per_sec": 0, 00:29:30.607 "r_mbytes_per_sec": 0, 00:29:30.607 "w_mbytes_per_sec": 0 00:29:30.607 }, 00:29:30.607 "claimed": false, 00:29:30.607 "zoned": false, 00:29:30.607 "supported_io_types": { 00:29:30.607 "read": true, 00:29:30.607 "write": true, 00:29:30.607 "unmap": true, 00:29:30.607 "flush": false, 00:29:30.607 "reset": true, 00:29:30.607 "nvme_admin": false, 00:29:30.607 "nvme_io": false, 00:29:30.607 "nvme_io_md": false, 00:29:30.607 "write_zeroes": true, 00:29:30.607 "zcopy": false, 00:29:30.607 "get_zone_info": false, 00:29:30.607 "zone_management": false, 00:29:30.607 "zone_append": false, 00:29:30.607 "compare": false, 00:29:30.607 "compare_and_write": false, 00:29:30.607 "abort": false, 00:29:30.607 "seek_hole": true, 00:29:30.607 "seek_data": true, 00:29:30.607 "copy": false, 00:29:30.607 "nvme_iov_md": false 00:29:30.607 }, 00:29:30.607 "driver_specific": { 00:29:30.607 "lvol": { 00:29:30.607 "lvol_store_uuid": "707bd857-0154-43ea-b485-8782cd579f01", 00:29:30.607 "base_bdev": "Nvme0n1", 00:29:30.607 "thin_provision": true, 00:29:30.607 "num_allocated_clusters": 0, 00:29:30.607 "snapshot": false, 00:29:30.607 "clone": false, 00:29:30.607 "esnap_clone": false 00:29:30.607 } 00:29:30.607 } 00:29:30.607 } 00:29:30.607 ] 00:29:30.607 16:05:50 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:30.607 16:05:50 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:29:30.607 16:05:50 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:29:30.868 [2024-07-12 16:05:51.108394] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:30.868 COMP_lvs0/lv0 00:29:30.868 16:05:51 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:30.868 16:05:51 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:30.868 16:05:51 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:30.868 16:05:51 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:30.868 16:05:51 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:30.868 16:05:51 compress_compdev -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:30.868 16:05:51 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:31.128 16:05:51 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:31.128 [ 00:29:31.128 { 00:29:31.128 "name": "COMP_lvs0/lv0", 00:29:31.128 "aliases": [ 00:29:31.128 "b7e47d1c-a8f7-54ae-8672-4e5c1f196f0a" 00:29:31.128 ], 00:29:31.128 "product_name": "compress", 00:29:31.128 "block_size": 4096, 00:29:31.128 "num_blocks": 25088, 00:29:31.128 "uuid": "b7e47d1c-a8f7-54ae-8672-4e5c1f196f0a", 00:29:31.128 "assigned_rate_limits": { 00:29:31.128 "rw_ios_per_sec": 0, 00:29:31.128 "rw_mbytes_per_sec": 0, 00:29:31.128 "r_mbytes_per_sec": 0, 00:29:31.128 "w_mbytes_per_sec": 0 00:29:31.128 }, 00:29:31.128 "claimed": false, 00:29:31.128 "zoned": false, 00:29:31.128 "supported_io_types": { 00:29:31.128 "read": true, 00:29:31.128 "write": true, 00:29:31.128 "unmap": false, 00:29:31.128 "flush": false, 00:29:31.128 "reset": false, 00:29:31.128 "nvme_admin": false, 00:29:31.128 "nvme_io": false, 00:29:31.128 "nvme_io_md": false, 00:29:31.128 "write_zeroes": true, 00:29:31.128 "zcopy": false, 00:29:31.128 "get_zone_info": false, 00:29:31.128 "zone_management": false, 00:29:31.128 "zone_append": false, 00:29:31.128 "compare": false, 00:29:31.128 "compare_and_write": false, 00:29:31.128 "abort": false, 00:29:31.128 "seek_hole": false, 00:29:31.128 "seek_data": false, 00:29:31.128 "copy": false, 00:29:31.128 "nvme_iov_md": false 00:29:31.128 }, 00:29:31.128 "driver_specific": { 00:29:31.128 "compress": { 00:29:31.128 "name": "COMP_lvs0/lv0", 00:29:31.128 "base_bdev_name": "1cfde57b-ee07-4abc-bd4b-062e42c4da76" 00:29:31.128 } 00:29:31.128 } 00:29:31.128 } 00:29:31.128 ] 00:29:31.388 16:05:51 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:31.388 16:05:51 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:31.388 [2024-07-12 16:05:51.698071] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fc8341b15c0 PMD being used: compress_qat 00:29:31.388 [2024-07-12 16:05:51.700844] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2077aa0 PMD being used: compress_qat 00:29:31.388 Running I/O for 3 seconds... 
00:29:34.684 00:29:34.684 Latency(us) 00:29:34.684 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:34.684 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:34.684 Verification LBA range: start 0x0 length 0x3100 00:29:34.684 COMP_lvs0/lv0 : 3.01 1542.53 6.03 0.00 0.00 20651.85 500.97 22282.24 00:29:34.684 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:34.684 Verification LBA range: start 0x3100 length 0x3100 00:29:34.684 COMP_lvs0/lv0 : 3.01 1620.62 6.33 0.00 0.00 19630.88 256.79 22080.59 00:29:34.684 =================================================================================================================== 00:29:34.684 Total : 3163.15 12.36 0.00 0.00 20129.04 256.79 22282.24 00:29:34.684 0 00:29:34.684 16:05:54 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:34.684 16:05:54 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:34.684 16:05:54 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:34.944 16:05:55 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:34.944 16:05:55 compress_compdev -- compress/compress.sh@78 -- # killprocess 2694757 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2694757 ']' 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2694757 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2694757 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2694757' 00:29:34.944 killing process with pid 2694757 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@967 -- # kill 2694757 00:29:34.944 Received shutdown signal, test time was about 3.000000 seconds 00:29:34.944 00:29:34.944 Latency(us) 00:29:34.944 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:34.944 =================================================================================================================== 00:29:34.944 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:34.944 16:05:55 compress_compdev -- common/autotest_common.sh@972 -- # wait 2694757 00:29:37.498 16:05:57 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:29:37.498 16:05:57 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:37.498 16:05:57 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=2696906 00:29:37.498 16:05:57 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:37.498 16:05:57 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 2696906 00:29:37.498 16:05:57 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:29:37.498 16:05:57 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 2696906 ']' 00:29:37.498 16:05:57 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:37.498 16:05:57 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:37.498 16:05:57 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:37.498 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:37.498 16:05:57 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:37.498 16:05:57 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:37.498 [2024-07-12 16:05:57.742161] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:29:37.498 [2024-07-12 16:05:57.742231] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2696906 ] 00:29:37.498 [2024-07-12 16:05:57.833293] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:37.498 [2024-07-12 16:05:57.929797] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:37.498 [2024-07-12 16:05:57.929992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:37.498 [2024-07-12 16:05:57.929992] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:38.068 [2024-07-12 16:05:58.401637] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:38.328 16:05:58 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:38.328 16:05:58 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:29:38.328 16:05:58 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:29:38.328 16:05:58 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:38.328 16:05:58 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:41.623 [2024-07-12 16:06:01.659631] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22182e0 PMD being used: compress_qat 00:29:41.623 16:06:01 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:41.623 16:06:01 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:41.623 16:06:01 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:41.623 16:06:01 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:41.623 16:06:01 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:41.623 16:06:01 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:41.623 16:06:01 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:41.623 16:06:01 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:41.883 [ 00:29:41.883 { 00:29:41.883 "name": "Nvme0n1", 00:29:41.883 "aliases": [ 00:29:41.883 "306dc520-3a31-4a3a-8af6-ad829a673c28" 00:29:41.883 ], 00:29:41.883 
"product_name": "NVMe disk", 00:29:41.883 "block_size": 512, 00:29:41.883 "num_blocks": 3907029168, 00:29:41.883 "uuid": "306dc520-3a31-4a3a-8af6-ad829a673c28", 00:29:41.883 "assigned_rate_limits": { 00:29:41.883 "rw_ios_per_sec": 0, 00:29:41.883 "rw_mbytes_per_sec": 0, 00:29:41.883 "r_mbytes_per_sec": 0, 00:29:41.883 "w_mbytes_per_sec": 0 00:29:41.883 }, 00:29:41.883 "claimed": false, 00:29:41.883 "zoned": false, 00:29:41.883 "supported_io_types": { 00:29:41.883 "read": true, 00:29:41.883 "write": true, 00:29:41.883 "unmap": true, 00:29:41.883 "flush": true, 00:29:41.883 "reset": true, 00:29:41.883 "nvme_admin": true, 00:29:41.883 "nvme_io": true, 00:29:41.883 "nvme_io_md": false, 00:29:41.883 "write_zeroes": true, 00:29:41.883 "zcopy": false, 00:29:41.883 "get_zone_info": false, 00:29:41.883 "zone_management": false, 00:29:41.883 "zone_append": false, 00:29:41.883 "compare": false, 00:29:41.883 "compare_and_write": false, 00:29:41.883 "abort": true, 00:29:41.883 "seek_hole": false, 00:29:41.883 "seek_data": false, 00:29:41.883 "copy": false, 00:29:41.883 "nvme_iov_md": false 00:29:41.883 }, 00:29:41.883 "driver_specific": { 00:29:41.883 "nvme": [ 00:29:41.883 { 00:29:41.883 "pci_address": "0000:65:00.0", 00:29:41.883 "trid": { 00:29:41.883 "trtype": "PCIe", 00:29:41.883 "traddr": "0000:65:00.0" 00:29:41.883 }, 00:29:41.883 "ctrlr_data": { 00:29:41.883 "cntlid": 0, 00:29:41.883 "vendor_id": "0x8086", 00:29:41.883 "model_number": "INTEL SSDPE2KX020T8", 00:29:41.883 "serial_number": "PHLJ9512038S2P0BGN", 00:29:41.883 "firmware_revision": "VDV10184", 00:29:41.883 "oacs": { 00:29:41.883 "security": 0, 00:29:41.883 "format": 1, 00:29:41.883 "firmware": 1, 00:29:41.883 "ns_manage": 1 00:29:41.883 }, 00:29:41.883 "multi_ctrlr": false, 00:29:41.883 "ana_reporting": false 00:29:41.883 }, 00:29:41.883 "vs": { 00:29:41.883 "nvme_version": "1.2" 00:29:41.883 }, 00:29:41.883 "ns_data": { 00:29:41.883 "id": 1, 00:29:41.883 "can_share": false 00:29:41.883 } 00:29:41.883 } 00:29:41.883 ], 00:29:41.883 "mp_policy": "active_passive" 00:29:41.883 } 00:29:41.883 } 00:29:41.883 ] 00:29:41.883 16:06:02 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:41.883 16:06:02 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:42.143 [2024-07-12 16:06:02.348997] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2066940 PMD being used: compress_qat 00:29:43.081 7dd91488-97fd-46cb-a51e-8131b3466b46 00:29:43.081 16:06:03 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:43.340 4657f5d3-bcba-4e60-b177-c60f69713a4b 00:29:43.340 16:06:03 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:43.340 16:06:03 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:43.340 16:06:03 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:43.340 16:06:03 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:43.340 16:06:03 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:43.340 16:06:03 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:43.340 16:06:03 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:43.599 16:06:03 compress_compdev -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:43.599 [ 00:29:43.599 { 00:29:43.599 "name": "4657f5d3-bcba-4e60-b177-c60f69713a4b", 00:29:43.599 "aliases": [ 00:29:43.599 "lvs0/lv0" 00:29:43.599 ], 00:29:43.599 "product_name": "Logical Volume", 00:29:43.599 "block_size": 512, 00:29:43.600 "num_blocks": 204800, 00:29:43.600 "uuid": "4657f5d3-bcba-4e60-b177-c60f69713a4b", 00:29:43.600 "assigned_rate_limits": { 00:29:43.600 "rw_ios_per_sec": 0, 00:29:43.600 "rw_mbytes_per_sec": 0, 00:29:43.600 "r_mbytes_per_sec": 0, 00:29:43.600 "w_mbytes_per_sec": 0 00:29:43.600 }, 00:29:43.600 "claimed": false, 00:29:43.600 "zoned": false, 00:29:43.600 "supported_io_types": { 00:29:43.600 "read": true, 00:29:43.600 "write": true, 00:29:43.600 "unmap": true, 00:29:43.600 "flush": false, 00:29:43.600 "reset": true, 00:29:43.600 "nvme_admin": false, 00:29:43.600 "nvme_io": false, 00:29:43.600 "nvme_io_md": false, 00:29:43.600 "write_zeroes": true, 00:29:43.600 "zcopy": false, 00:29:43.600 "get_zone_info": false, 00:29:43.600 "zone_management": false, 00:29:43.600 "zone_append": false, 00:29:43.600 "compare": false, 00:29:43.600 "compare_and_write": false, 00:29:43.600 "abort": false, 00:29:43.600 "seek_hole": true, 00:29:43.600 "seek_data": true, 00:29:43.600 "copy": false, 00:29:43.600 "nvme_iov_md": false 00:29:43.600 }, 00:29:43.600 "driver_specific": { 00:29:43.600 "lvol": { 00:29:43.600 "lvol_store_uuid": "7dd91488-97fd-46cb-a51e-8131b3466b46", 00:29:43.600 "base_bdev": "Nvme0n1", 00:29:43.600 "thin_provision": true, 00:29:43.600 "num_allocated_clusters": 0, 00:29:43.600 "snapshot": false, 00:29:43.600 "clone": false, 00:29:43.600 "esnap_clone": false 00:29:43.600 } 00:29:43.600 } 00:29:43.600 } 00:29:43.600 ] 00:29:43.859 16:06:04 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:43.859 16:06:04 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:43.859 16:06:04 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:43.859 [2024-07-12 16:06:04.253450] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:43.859 COMP_lvs0/lv0 00:29:43.859 16:06:04 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:43.859 16:06:04 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:43.859 16:06:04 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:43.859 16:06:04 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:29:43.859 16:06:04 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:43.859 16:06:04 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:43.859 16:06:04 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:44.119 16:06:04 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:44.380 [ 00:29:44.380 { 00:29:44.380 "name": "COMP_lvs0/lv0", 00:29:44.380 "aliases": [ 00:29:44.380 "c611e8fe-4def-50c0-bda6-497edd47e930" 00:29:44.380 ], 00:29:44.380 "product_name": "compress", 00:29:44.380 "block_size": 512, 00:29:44.380 "num_blocks": 200704, 00:29:44.380 "uuid": 
"c611e8fe-4def-50c0-bda6-497edd47e930", 00:29:44.380 "assigned_rate_limits": { 00:29:44.380 "rw_ios_per_sec": 0, 00:29:44.380 "rw_mbytes_per_sec": 0, 00:29:44.380 "r_mbytes_per_sec": 0, 00:29:44.380 "w_mbytes_per_sec": 0 00:29:44.380 }, 00:29:44.380 "claimed": false, 00:29:44.380 "zoned": false, 00:29:44.380 "supported_io_types": { 00:29:44.380 "read": true, 00:29:44.380 "write": true, 00:29:44.380 "unmap": false, 00:29:44.380 "flush": false, 00:29:44.380 "reset": false, 00:29:44.380 "nvme_admin": false, 00:29:44.380 "nvme_io": false, 00:29:44.380 "nvme_io_md": false, 00:29:44.380 "write_zeroes": true, 00:29:44.380 "zcopy": false, 00:29:44.380 "get_zone_info": false, 00:29:44.380 "zone_management": false, 00:29:44.380 "zone_append": false, 00:29:44.380 "compare": false, 00:29:44.380 "compare_and_write": false, 00:29:44.380 "abort": false, 00:29:44.380 "seek_hole": false, 00:29:44.380 "seek_data": false, 00:29:44.380 "copy": false, 00:29:44.380 "nvme_iov_md": false 00:29:44.380 }, 00:29:44.380 "driver_specific": { 00:29:44.380 "compress": { 00:29:44.380 "name": "COMP_lvs0/lv0", 00:29:44.380 "base_bdev_name": "4657f5d3-bcba-4e60-b177-c60f69713a4b" 00:29:44.380 } 00:29:44.380 } 00:29:44.380 } 00:29:44.380 ] 00:29:44.380 16:06:04 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:29:44.380 16:06:04 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:44.380 [2024-07-12 16:06:04.803079] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa3181b1350 PMD being used: compress_qat 00:29:44.380 I/O targets: 00:29:44.380 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:29:44.380 00:29:44.380 00:29:44.380 CUnit - A unit testing framework for C - Version 2.1-3 00:29:44.380 http://cunit.sourceforge.net/ 00:29:44.380 00:29:44.380 00:29:44.380 Suite: bdevio tests on: COMP_lvs0/lv0 00:29:44.380 Test: blockdev write read block ...passed 00:29:44.380 Test: blockdev write zeroes read block ...passed 00:29:44.380 Test: blockdev write zeroes read no split ...passed 00:29:44.640 Test: blockdev write zeroes read split ...passed 00:29:44.640 Test: blockdev write zeroes read split partial ...passed 00:29:44.640 Test: blockdev reset ...[2024-07-12 16:06:04.924719] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:29:44.640 passed 00:29:44.640 Test: blockdev write read 8 blocks ...passed 00:29:44.640 Test: blockdev write read size > 128k ...passed 00:29:44.640 Test: blockdev write read invalid size ...passed 00:29:44.640 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:44.640 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:44.640 Test: blockdev write read max offset ...passed 00:29:44.640 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:44.640 Test: blockdev writev readv 8 blocks ...passed 00:29:44.640 Test: blockdev writev readv 30 x 1block ...passed 00:29:44.640 Test: blockdev writev readv block ...passed 00:29:44.640 Test: blockdev writev readv size > 128k ...passed 00:29:44.640 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:44.640 Test: blockdev comparev and writev ...passed 00:29:44.640 Test: blockdev nvme passthru rw ...passed 00:29:44.640 Test: blockdev nvme passthru vendor specific ...passed 00:29:44.640 Test: blockdev nvme admin passthru ...passed 00:29:44.640 Test: blockdev copy ...passed 00:29:44.640 00:29:44.640 Run Summary: Type Total Ran 
Passed Failed Inactive 00:29:44.640 suites 1 1 n/a 0 0 00:29:44.640 tests 23 23 23 0 0 00:29:44.640 asserts 130 130 130 0 n/a 00:29:44.640 00:29:44.640 Elapsed time = 0.337 seconds 00:29:44.640 0 00:29:44.640 16:06:04 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:29:44.640 16:06:04 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:44.899 16:06:05 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:45.160 16:06:05 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:29:45.160 16:06:05 compress_compdev -- compress/compress.sh@62 -- # killprocess 2696906 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 2696906 ']' 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 2696906 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2696906 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2696906' 00:29:45.160 killing process with pid 2696906 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@967 -- # kill 2696906 00:29:45.160 16:06:05 compress_compdev -- common/autotest_common.sh@972 -- # wait 2696906 00:29:47.703 16:06:07 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:29:47.703 16:06:07 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:29:47.703 00:29:47.703 real 0m49.384s 00:29:47.703 user 1m52.061s 00:29:47.703 sys 0m4.340s 00:29:47.704 16:06:07 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:47.704 16:06:07 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:47.704 ************************************ 00:29:47.704 END TEST compress_compdev 00:29:47.704 ************************************ 00:29:47.704 16:06:07 -- common/autotest_common.sh@1142 -- # return 0 00:29:47.704 16:06:07 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:29:47.704 16:06:07 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:29:47.704 16:06:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:47.704 16:06:07 -- common/autotest_common.sh@10 -- # set +x 00:29:47.704 ************************************ 00:29:47.704 START TEST compress_isal 00:29:47.704 ************************************ 00:29:47.704 16:06:07 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:29:47.704 * Looking for test storage... 
00:29:47.704 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:47.704 16:06:08 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:47.704 16:06:08 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:47.704 16:06:08 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:47.704 16:06:08 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:47.704 16:06:08 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:47.704 16:06:08 compress_isal -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:47.704 16:06:08 compress_isal -- paths/export.sh@5 -- # export PATH 00:29:47.704 16:06:08 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@47 -- # : 0 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:29:47.704 16:06:08 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2698899 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2698899 00:29:47.704 16:06:08 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2698899 ']' 00:29:47.704 16:06:08 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:29:47.704 16:06:08 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:47.704 16:06:08 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:47.704 16:06:08 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:47.704 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
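For reference, the bdevperf flow logged above can be reproduced by hand roughly as follows. This is a minimal sketch assuming the same workspace layout, with the flags taken verbatim from the '-z -q 32 -o 4096 -w verify -t 3 -C -m 0x6' invocation shown here:

    # start bdevperf in wait-for-RPC mode (-z) so the bdev stack can be built first
    /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf \
        -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 &
    bdevperf_pid=$!
    # ... create the Nvme0n1 -> lvs0/lv0 -> COMP_lvs0/lv0 stack via rpc.py ...
    # then trigger the actual 3-second verify workload over RPC
    /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests
    kill $bdevperf_pid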
00:29:47.704 16:06:08 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:47.704 16:06:08 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:29:47.704 [2024-07-12 16:06:08.118521] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:29:47.704 [2024-07-12 16:06:08.118579] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2698899 ] 00:29:47.965 [2024-07-12 16:06:08.203309] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:47.965 [2024-07-12 16:06:08.303777] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:47.965 [2024-07-12 16:06:08.303839] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:48.537 16:06:08 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:48.537 16:06:08 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:29:48.537 16:06:08 compress_isal -- compress/compress.sh@74 -- # create_vols 00:29:48.537 16:06:08 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:48.537 16:06:08 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:51.839 16:06:12 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:51.839 16:06:12 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:29:51.839 16:06:12 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:51.839 16:06:12 compress_isal -- common/autotest_common.sh@899 -- # local i 00:29:51.839 16:06:12 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:51.839 16:06:12 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:51.839 16:06:12 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:51.839 16:06:12 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:52.100 [ 00:29:52.100 { 00:29:52.100 "name": "Nvme0n1", 00:29:52.100 "aliases": [ 00:29:52.100 "ca66b5f0-26f8-4a6c-83cb-24dd0c8af65a" 00:29:52.100 ], 00:29:52.100 "product_name": "NVMe disk", 00:29:52.100 "block_size": 512, 00:29:52.100 "num_blocks": 3907029168, 00:29:52.100 "uuid": "ca66b5f0-26f8-4a6c-83cb-24dd0c8af65a", 00:29:52.100 "assigned_rate_limits": { 00:29:52.100 "rw_ios_per_sec": 0, 00:29:52.100 "rw_mbytes_per_sec": 0, 00:29:52.100 "r_mbytes_per_sec": 0, 00:29:52.100 "w_mbytes_per_sec": 0 00:29:52.100 }, 00:29:52.100 "claimed": false, 00:29:52.100 "zoned": false, 00:29:52.100 "supported_io_types": { 00:29:52.100 "read": true, 00:29:52.100 "write": true, 00:29:52.100 "unmap": true, 00:29:52.100 "flush": true, 00:29:52.100 "reset": true, 00:29:52.100 "nvme_admin": true, 00:29:52.100 "nvme_io": true, 00:29:52.100 "nvme_io_md": false, 00:29:52.100 "write_zeroes": true, 00:29:52.100 "zcopy": false, 00:29:52.100 "get_zone_info": false, 00:29:52.100 "zone_management": false, 00:29:52.100 "zone_append": false, 00:29:52.100 "compare": false, 00:29:52.100 "compare_and_write": false, 00:29:52.100 "abort": true, 00:29:52.100 "seek_hole": false, 00:29:52.100 "seek_data": false, 00:29:52.100 "copy": false, 00:29:52.100 
"nvme_iov_md": false 00:29:52.100 }, 00:29:52.100 "driver_specific": { 00:29:52.100 "nvme": [ 00:29:52.100 { 00:29:52.100 "pci_address": "0000:65:00.0", 00:29:52.100 "trid": { 00:29:52.100 "trtype": "PCIe", 00:29:52.100 "traddr": "0000:65:00.0" 00:29:52.100 }, 00:29:52.100 "ctrlr_data": { 00:29:52.100 "cntlid": 0, 00:29:52.100 "vendor_id": "0x8086", 00:29:52.100 "model_number": "INTEL SSDPE2KX020T8", 00:29:52.100 "serial_number": "PHLJ9512038S2P0BGN", 00:29:52.100 "firmware_revision": "VDV10184", 00:29:52.100 "oacs": { 00:29:52.100 "security": 0, 00:29:52.100 "format": 1, 00:29:52.100 "firmware": 1, 00:29:52.100 "ns_manage": 1 00:29:52.100 }, 00:29:52.100 "multi_ctrlr": false, 00:29:52.100 "ana_reporting": false 00:29:52.100 }, 00:29:52.100 "vs": { 00:29:52.100 "nvme_version": "1.2" 00:29:52.100 }, 00:29:52.100 "ns_data": { 00:29:52.100 "id": 1, 00:29:52.100 "can_share": false 00:29:52.100 } 00:29:52.100 } 00:29:52.100 ], 00:29:52.100 "mp_policy": "active_passive" 00:29:52.100 } 00:29:52.100 } 00:29:52.100 ] 00:29:52.100 16:06:12 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:29:52.100 16:06:12 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:53.521 43ec0fb1-eaec-415b-98a7-a861aebc9339 00:29:53.521 16:06:13 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:53.784 497966ba-de76-425e-88f7-b3afb9a676c2 00:29:53.784 16:06:14 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:53.784 16:06:14 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:29:53.784 16:06:14 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:53.784 16:06:14 compress_isal -- common/autotest_common.sh@899 -- # local i 00:29:53.784 16:06:14 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:53.784 16:06:14 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:53.784 16:06:14 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:54.044 16:06:14 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:54.044 [ 00:29:54.044 { 00:29:54.044 "name": "497966ba-de76-425e-88f7-b3afb9a676c2", 00:29:54.044 "aliases": [ 00:29:54.044 "lvs0/lv0" 00:29:54.044 ], 00:29:54.044 "product_name": "Logical Volume", 00:29:54.044 "block_size": 512, 00:29:54.044 "num_blocks": 204800, 00:29:54.044 "uuid": "497966ba-de76-425e-88f7-b3afb9a676c2", 00:29:54.044 "assigned_rate_limits": { 00:29:54.044 "rw_ios_per_sec": 0, 00:29:54.044 "rw_mbytes_per_sec": 0, 00:29:54.044 "r_mbytes_per_sec": 0, 00:29:54.044 "w_mbytes_per_sec": 0 00:29:54.044 }, 00:29:54.044 "claimed": false, 00:29:54.044 "zoned": false, 00:29:54.044 "supported_io_types": { 00:29:54.044 "read": true, 00:29:54.044 "write": true, 00:29:54.044 "unmap": true, 00:29:54.044 "flush": false, 00:29:54.044 "reset": true, 00:29:54.044 "nvme_admin": false, 00:29:54.044 "nvme_io": false, 00:29:54.044 "nvme_io_md": false, 00:29:54.044 "write_zeroes": true, 00:29:54.044 "zcopy": false, 00:29:54.044 "get_zone_info": false, 00:29:54.044 "zone_management": false, 00:29:54.044 "zone_append": false, 00:29:54.044 "compare": false, 00:29:54.045 "compare_and_write": false, 
00:29:54.045 "abort": false, 00:29:54.045 "seek_hole": true, 00:29:54.045 "seek_data": true, 00:29:54.045 "copy": false, 00:29:54.045 "nvme_iov_md": false 00:29:54.045 }, 00:29:54.045 "driver_specific": { 00:29:54.045 "lvol": { 00:29:54.045 "lvol_store_uuid": "43ec0fb1-eaec-415b-98a7-a861aebc9339", 00:29:54.045 "base_bdev": "Nvme0n1", 00:29:54.045 "thin_provision": true, 00:29:54.045 "num_allocated_clusters": 0, 00:29:54.045 "snapshot": false, 00:29:54.045 "clone": false, 00:29:54.045 "esnap_clone": false 00:29:54.045 } 00:29:54.045 } 00:29:54.045 } 00:29:54.045 ] 00:29:54.045 16:06:14 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:29:54.045 16:06:14 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:54.045 16:06:14 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:54.305 [2024-07-12 16:06:14.643255] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:54.305 COMP_lvs0/lv0 00:29:54.305 16:06:14 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:54.305 16:06:14 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:29:54.305 16:06:14 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:54.305 16:06:14 compress_isal -- common/autotest_common.sh@899 -- # local i 00:29:54.305 16:06:14 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:54.305 16:06:14 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:54.305 16:06:14 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:54.566 16:06:14 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:54.826 [ 00:29:54.826 { 00:29:54.826 "name": "COMP_lvs0/lv0", 00:29:54.826 "aliases": [ 00:29:54.826 "e620e7cd-b6c2-5e54-a1ae-ae6ca6694dba" 00:29:54.826 ], 00:29:54.826 "product_name": "compress", 00:29:54.826 "block_size": 512, 00:29:54.826 "num_blocks": 200704, 00:29:54.826 "uuid": "e620e7cd-b6c2-5e54-a1ae-ae6ca6694dba", 00:29:54.826 "assigned_rate_limits": { 00:29:54.826 "rw_ios_per_sec": 0, 00:29:54.826 "rw_mbytes_per_sec": 0, 00:29:54.826 "r_mbytes_per_sec": 0, 00:29:54.826 "w_mbytes_per_sec": 0 00:29:54.826 }, 00:29:54.826 "claimed": false, 00:29:54.826 "zoned": false, 00:29:54.826 "supported_io_types": { 00:29:54.826 "read": true, 00:29:54.826 "write": true, 00:29:54.826 "unmap": false, 00:29:54.826 "flush": false, 00:29:54.826 "reset": false, 00:29:54.826 "nvme_admin": false, 00:29:54.826 "nvme_io": false, 00:29:54.826 "nvme_io_md": false, 00:29:54.826 "write_zeroes": true, 00:29:54.826 "zcopy": false, 00:29:54.826 "get_zone_info": false, 00:29:54.826 "zone_management": false, 00:29:54.826 "zone_append": false, 00:29:54.826 "compare": false, 00:29:54.826 "compare_and_write": false, 00:29:54.826 "abort": false, 00:29:54.826 "seek_hole": false, 00:29:54.826 "seek_data": false, 00:29:54.826 "copy": false, 00:29:54.826 "nvme_iov_md": false 00:29:54.826 }, 00:29:54.826 "driver_specific": { 00:29:54.826 "compress": { 00:29:54.826 "name": "COMP_lvs0/lv0", 00:29:54.827 "base_bdev_name": "497966ba-de76-425e-88f7-b3afb9a676c2" 00:29:54.827 } 00:29:54.827 } 00:29:54.827 } 00:29:54.827 ] 00:29:54.827 16:06:15 compress_isal -- 
common/autotest_common.sh@905 -- # return 0 00:29:54.827 16:06:15 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:54.827 Running I/O for 3 seconds... 00:29:58.125 00:29:58.125 Latency(us) 00:29:58.125 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:58.125 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:58.125 Verification LBA range: start 0x0 length 0x3100 00:29:58.125 COMP_lvs0/lv0 : 3.02 1094.62 4.28 0.00 0.00 29089.23 806.60 29642.44 00:29:58.125 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:58.125 Verification LBA range: start 0x3100 length 0x3100 00:29:58.125 COMP_lvs0/lv0 : 3.02 1107.64 4.33 0.00 0.00 28732.16 1064.96 29037.49 00:29:58.125 =================================================================================================================== 00:29:58.125 Total : 2202.26 8.60 0.00 0.00 28909.73 806.60 29642.44 00:29:58.125 0 00:29:58.125 16:06:18 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:29:58.125 16:06:18 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:58.125 16:06:18 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:58.385 16:06:18 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:58.385 16:06:18 compress_isal -- compress/compress.sh@78 -- # killprocess 2698899 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2698899 ']' 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2698899 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@953 -- # uname 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2698899 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2698899' 00:29:58.385 killing process with pid 2698899 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@967 -- # kill 2698899 00:29:58.385 Received shutdown signal, test time was about 3.000000 seconds 00:29:58.385 00:29:58.385 Latency(us) 00:29:58.385 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:58.385 =================================================================================================================== 00:29:58.385 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:58.385 16:06:18 compress_isal -- common/autotest_common.sh@972 -- # wait 2698899 00:30:00.924 16:06:21 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:30:00.924 16:06:21 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:00.924 16:06:21 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2701343 00:30:00.924 16:06:21 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:00.924 16:06:21 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2701343 
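Between passes the harness destroys and recreates the whole compress stack. Condensed, the cycle uses only the rpc.py calls already visible in this log (the '-l 512' chunk size is what distinguishes this second pass from the first, which omitted -l); a sketch:

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    # create: lvstore on the NVMe bdev, a 100 MiB thin-provisioned lvol, then the compress vbdev on top
    $rpc bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0
    $rpc bdev_lvol_create -t -l lvs0 lv0 100
    $rpc bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512
    # destroy: compress vbdev first, then the lvstore
    $rpc bdev_compress_delete COMP_lvs0/lv0
    $rpc bdev_lvol_delete_lvstore -l lvs0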
00:30:00.924 16:06:21 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:00.924 16:06:21 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2701343 ']' 00:30:00.924 16:06:21 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:00.924 16:06:21 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:00.924 16:06:21 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:00.924 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:00.924 16:06:21 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:00.924 16:06:21 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:00.924 [2024-07-12 16:06:21.276776] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:30:00.924 [2024-07-12 16:06:21.276846] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2701343 ] 00:30:00.924 [2024-07-12 16:06:21.362155] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:01.183 [2024-07-12 16:06:21.465608] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:01.183 [2024-07-12 16:06:21.465613] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:01.752 16:06:22 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:01.752 16:06:22 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:01.752 16:06:22 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:30:01.752 16:06:22 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:01.752 16:06:22 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:05.048 16:06:25 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:05.048 16:06:25 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:05.048 16:06:25 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:05.048 16:06:25 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:05.048 16:06:25 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:05.048 16:06:25 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:05.048 16:06:25 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:05.048 16:06:25 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:05.308 [ 00:30:05.308 { 00:30:05.308 "name": "Nvme0n1", 00:30:05.308 "aliases": [ 00:30:05.308 "0fb20e04-8a88-48ba-9a16-83a3d856465d" 00:30:05.308 ], 00:30:05.308 "product_name": "NVMe disk", 00:30:05.308 "block_size": 512, 00:30:05.308 "num_blocks": 3907029168, 00:30:05.308 "uuid": "0fb20e04-8a88-48ba-9a16-83a3d856465d", 00:30:05.308 "assigned_rate_limits": { 00:30:05.308 "rw_ios_per_sec": 0, 00:30:05.308 "rw_mbytes_per_sec": 0, 00:30:05.308 "r_mbytes_per_sec": 0, 00:30:05.308 
"w_mbytes_per_sec": 0 00:30:05.308 }, 00:30:05.308 "claimed": false, 00:30:05.308 "zoned": false, 00:30:05.308 "supported_io_types": { 00:30:05.309 "read": true, 00:30:05.309 "write": true, 00:30:05.309 "unmap": true, 00:30:05.309 "flush": true, 00:30:05.309 "reset": true, 00:30:05.309 "nvme_admin": true, 00:30:05.309 "nvme_io": true, 00:30:05.309 "nvme_io_md": false, 00:30:05.309 "write_zeroes": true, 00:30:05.309 "zcopy": false, 00:30:05.309 "get_zone_info": false, 00:30:05.309 "zone_management": false, 00:30:05.309 "zone_append": false, 00:30:05.309 "compare": false, 00:30:05.309 "compare_and_write": false, 00:30:05.309 "abort": true, 00:30:05.309 "seek_hole": false, 00:30:05.309 "seek_data": false, 00:30:05.309 "copy": false, 00:30:05.309 "nvme_iov_md": false 00:30:05.309 }, 00:30:05.309 "driver_specific": { 00:30:05.309 "nvme": [ 00:30:05.309 { 00:30:05.309 "pci_address": "0000:65:00.0", 00:30:05.309 "trid": { 00:30:05.309 "trtype": "PCIe", 00:30:05.309 "traddr": "0000:65:00.0" 00:30:05.309 }, 00:30:05.309 "ctrlr_data": { 00:30:05.309 "cntlid": 0, 00:30:05.309 "vendor_id": "0x8086", 00:30:05.309 "model_number": "INTEL SSDPE2KX020T8", 00:30:05.309 "serial_number": "PHLJ9512038S2P0BGN", 00:30:05.309 "firmware_revision": "VDV10184", 00:30:05.309 "oacs": { 00:30:05.309 "security": 0, 00:30:05.309 "format": 1, 00:30:05.309 "firmware": 1, 00:30:05.309 "ns_manage": 1 00:30:05.309 }, 00:30:05.309 "multi_ctrlr": false, 00:30:05.309 "ana_reporting": false 00:30:05.309 }, 00:30:05.309 "vs": { 00:30:05.309 "nvme_version": "1.2" 00:30:05.309 }, 00:30:05.309 "ns_data": { 00:30:05.309 "id": 1, 00:30:05.309 "can_share": false 00:30:05.309 } 00:30:05.309 } 00:30:05.309 ], 00:30:05.309 "mp_policy": "active_passive" 00:30:05.309 } 00:30:05.309 } 00:30:05.309 ] 00:30:05.309 16:06:25 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:05.309 16:06:25 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:06.692 a691513f-679f-4cb4-8bc9-874622cad07c 00:30:06.692 16:06:26 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:06.692 51a0bd5b-1790-4f7c-a4d9-d6a9a9f76e93 00:30:06.952 16:06:27 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:06.952 16:06:27 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:06.952 16:06:27 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:06.952 16:06:27 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:06.952 16:06:27 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:06.952 16:06:27 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:06.952 16:06:27 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:06.952 16:06:27 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:07.213 [ 00:30:07.213 { 00:30:07.213 "name": "51a0bd5b-1790-4f7c-a4d9-d6a9a9f76e93", 00:30:07.213 "aliases": [ 00:30:07.213 "lvs0/lv0" 00:30:07.213 ], 00:30:07.213 "product_name": "Logical Volume", 00:30:07.213 "block_size": 512, 00:30:07.213 "num_blocks": 204800, 00:30:07.213 "uuid": "51a0bd5b-1790-4f7c-a4d9-d6a9a9f76e93", 00:30:07.213 
"assigned_rate_limits": { 00:30:07.213 "rw_ios_per_sec": 0, 00:30:07.213 "rw_mbytes_per_sec": 0, 00:30:07.213 "r_mbytes_per_sec": 0, 00:30:07.213 "w_mbytes_per_sec": 0 00:30:07.213 }, 00:30:07.213 "claimed": false, 00:30:07.213 "zoned": false, 00:30:07.213 "supported_io_types": { 00:30:07.213 "read": true, 00:30:07.213 "write": true, 00:30:07.213 "unmap": true, 00:30:07.213 "flush": false, 00:30:07.213 "reset": true, 00:30:07.213 "nvme_admin": false, 00:30:07.213 "nvme_io": false, 00:30:07.213 "nvme_io_md": false, 00:30:07.213 "write_zeroes": true, 00:30:07.213 "zcopy": false, 00:30:07.213 "get_zone_info": false, 00:30:07.213 "zone_management": false, 00:30:07.213 "zone_append": false, 00:30:07.213 "compare": false, 00:30:07.213 "compare_and_write": false, 00:30:07.213 "abort": false, 00:30:07.213 "seek_hole": true, 00:30:07.213 "seek_data": true, 00:30:07.213 "copy": false, 00:30:07.213 "nvme_iov_md": false 00:30:07.213 }, 00:30:07.213 "driver_specific": { 00:30:07.213 "lvol": { 00:30:07.213 "lvol_store_uuid": "a691513f-679f-4cb4-8bc9-874622cad07c", 00:30:07.213 "base_bdev": "Nvme0n1", 00:30:07.213 "thin_provision": true, 00:30:07.213 "num_allocated_clusters": 0, 00:30:07.213 "snapshot": false, 00:30:07.213 "clone": false, 00:30:07.213 "esnap_clone": false 00:30:07.213 } 00:30:07.213 } 00:30:07.213 } 00:30:07.213 ] 00:30:07.213 16:06:27 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:07.213 16:06:27 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:30:07.213 16:06:27 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:30:07.474 [2024-07-12 16:06:27.784989] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:07.474 COMP_lvs0/lv0 00:30:07.474 16:06:27 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:07.474 16:06:27 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:07.474 16:06:27 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:07.474 16:06:27 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:07.474 16:06:27 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:07.474 16:06:27 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:07.474 16:06:27 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:07.734 16:06:28 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:07.994 [ 00:30:07.994 { 00:30:07.994 "name": "COMP_lvs0/lv0", 00:30:07.994 "aliases": [ 00:30:07.994 "42fff1b9-7af0-5c22-8a60-8db772b72d9c" 00:30:07.994 ], 00:30:07.994 "product_name": "compress", 00:30:07.994 "block_size": 512, 00:30:07.994 "num_blocks": 200704, 00:30:07.994 "uuid": "42fff1b9-7af0-5c22-8a60-8db772b72d9c", 00:30:07.994 "assigned_rate_limits": { 00:30:07.994 "rw_ios_per_sec": 0, 00:30:07.994 "rw_mbytes_per_sec": 0, 00:30:07.994 "r_mbytes_per_sec": 0, 00:30:07.994 "w_mbytes_per_sec": 0 00:30:07.994 }, 00:30:07.994 "claimed": false, 00:30:07.994 "zoned": false, 00:30:07.994 "supported_io_types": { 00:30:07.994 "read": true, 00:30:07.994 "write": true, 00:30:07.994 "unmap": false, 00:30:07.994 "flush": false, 00:30:07.994 "reset": false, 00:30:07.994 "nvme_admin": 
false, 00:30:07.994 "nvme_io": false, 00:30:07.994 "nvme_io_md": false, 00:30:07.994 "write_zeroes": true, 00:30:07.994 "zcopy": false, 00:30:07.994 "get_zone_info": false, 00:30:07.994 "zone_management": false, 00:30:07.994 "zone_append": false, 00:30:07.994 "compare": false, 00:30:07.994 "compare_and_write": false, 00:30:07.994 "abort": false, 00:30:07.994 "seek_hole": false, 00:30:07.994 "seek_data": false, 00:30:07.994 "copy": false, 00:30:07.994 "nvme_iov_md": false 00:30:07.994 }, 00:30:07.994 "driver_specific": { 00:30:07.994 "compress": { 00:30:07.994 "name": "COMP_lvs0/lv0", 00:30:07.994 "base_bdev_name": "51a0bd5b-1790-4f7c-a4d9-d6a9a9f76e93" 00:30:07.994 } 00:30:07.994 } 00:30:07.994 } 00:30:07.994 ] 00:30:07.994 16:06:28 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:07.994 16:06:28 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:07.994 Running I/O for 3 seconds... 00:30:11.292 00:30:11.292 Latency(us) 00:30:11.292 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:11.292 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:11.292 Verification LBA range: start 0x0 length 0x3100 00:30:11.292 COMP_lvs0/lv0 : 3.02 1093.41 4.27 0.00 0.00 29147.53 601.80 31860.58 00:30:11.292 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:11.292 Verification LBA range: start 0x3100 length 0x3100 00:30:11.292 COMP_lvs0/lv0 : 3.02 1101.00 4.30 0.00 0.00 28879.31 332.41 30852.33 00:30:11.292 =================================================================================================================== 00:30:11.292 Total : 2194.41 8.57 0.00 0.00 29012.94 332.41 31860.58 00:30:11.292 0 00:30:11.292 16:06:31 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:11.292 16:06:31 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:11.292 16:06:31 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:11.552 16:06:31 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:11.552 16:06:31 compress_isal -- compress/compress.sh@78 -- # killprocess 2701343 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2701343 ']' 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2701343 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2701343 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2701343' 00:30:11.552 killing process with pid 2701343 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@967 -- # kill 2701343 00:30:11.552 Received shutdown signal, test time was about 3.000000 seconds 00:30:11.552 00:30:11.552 Latency(us) 00:30:11.552 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:11.552 
=================================================================================================================== 00:30:11.552 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:11.552 16:06:31 compress_isal -- common/autotest_common.sh@972 -- # wait 2701343 00:30:14.093 16:06:34 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:30:14.093 16:06:34 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:14.093 16:06:34 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2703396 00:30:14.093 16:06:34 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:30:14.093 16:06:34 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:14.093 16:06:34 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2703396 00:30:14.093 16:06:34 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2703396 ']' 00:30:14.093 16:06:34 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:14.093 16:06:34 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:14.093 16:06:34 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:14.093 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:14.093 16:06:34 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:14.093 16:06:34 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:14.093 [2024-07-12 16:06:34.274412] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
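The waitforbdev helper exercised throughout this log boils down to waiting for bdev examine to complete and then querying the named bdev with a timeout. A simplified sketch of that pattern (the 2000 ms value matches the '-t 2000' seen above; the real helper's retry loop is omitted):

    rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
    waitforbdev() {
        local bdev_name=$1 bdev_timeout=${2:-2000}
        $rpc bdev_wait_for_examine
        # succeeds once the bdev is registered, or fails after the timeout
        $rpc bdev_get_bdevs -b "$bdev_name" -t "$bdev_timeout"
    }
    waitforbdev Nvme0n1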
00:30:14.093 [2024-07-12 16:06:34.274491] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2703396 ] 00:30:14.093 [2024-07-12 16:06:34.361450] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:14.093 [2024-07-12 16:06:34.463863] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:14.093 [2024-07-12 16:06:34.463869] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:15.035 16:06:35 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:15.035 16:06:35 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:15.035 16:06:35 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:30:15.035 16:06:35 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:15.035 16:06:35 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:18.337 16:06:38 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:18.337 16:06:38 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:18.337 16:06:38 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:18.337 16:06:38 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:18.337 16:06:38 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:18.337 16:06:38 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:18.337 16:06:38 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:18.337 16:06:38 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:18.337 [ 00:30:18.337 { 00:30:18.337 "name": "Nvme0n1", 00:30:18.337 "aliases": [ 00:30:18.337 "c86d9916-4205-4deb-9690-931eccf36e8a" 00:30:18.337 ], 00:30:18.337 "product_name": "NVMe disk", 00:30:18.337 "block_size": 512, 00:30:18.337 "num_blocks": 3907029168, 00:30:18.337 "uuid": "c86d9916-4205-4deb-9690-931eccf36e8a", 00:30:18.337 "assigned_rate_limits": { 00:30:18.337 "rw_ios_per_sec": 0, 00:30:18.337 "rw_mbytes_per_sec": 0, 00:30:18.337 "r_mbytes_per_sec": 0, 00:30:18.337 "w_mbytes_per_sec": 0 00:30:18.337 }, 00:30:18.337 "claimed": false, 00:30:18.337 "zoned": false, 00:30:18.337 "supported_io_types": { 00:30:18.337 "read": true, 00:30:18.337 "write": true, 00:30:18.337 "unmap": true, 00:30:18.337 "flush": true, 00:30:18.337 "reset": true, 00:30:18.337 "nvme_admin": true, 00:30:18.337 "nvme_io": true, 00:30:18.337 "nvme_io_md": false, 00:30:18.337 "write_zeroes": true, 00:30:18.337 "zcopy": false, 00:30:18.337 "get_zone_info": false, 00:30:18.337 "zone_management": false, 00:30:18.337 "zone_append": false, 00:30:18.337 "compare": false, 00:30:18.337 "compare_and_write": false, 00:30:18.337 "abort": true, 00:30:18.337 "seek_hole": false, 00:30:18.337 "seek_data": false, 00:30:18.337 "copy": false, 00:30:18.337 "nvme_iov_md": false 00:30:18.337 }, 00:30:18.337 "driver_specific": { 00:30:18.337 "nvme": [ 00:30:18.337 { 00:30:18.337 "pci_address": "0000:65:00.0", 00:30:18.337 "trid": { 00:30:18.337 "trtype": "PCIe", 00:30:18.337 "traddr": "0000:65:00.0" 00:30:18.337 }, 00:30:18.337 
"ctrlr_data": { 00:30:18.337 "cntlid": 0, 00:30:18.337 "vendor_id": "0x8086", 00:30:18.337 "model_number": "INTEL SSDPE2KX020T8", 00:30:18.337 "serial_number": "PHLJ9512038S2P0BGN", 00:30:18.337 "firmware_revision": "VDV10184", 00:30:18.337 "oacs": { 00:30:18.337 "security": 0, 00:30:18.337 "format": 1, 00:30:18.337 "firmware": 1, 00:30:18.337 "ns_manage": 1 00:30:18.337 }, 00:30:18.337 "multi_ctrlr": false, 00:30:18.337 "ana_reporting": false 00:30:18.337 }, 00:30:18.337 "vs": { 00:30:18.337 "nvme_version": "1.2" 00:30:18.337 }, 00:30:18.337 "ns_data": { 00:30:18.337 "id": 1, 00:30:18.337 "can_share": false 00:30:18.337 } 00:30:18.337 } 00:30:18.337 ], 00:30:18.337 "mp_policy": "active_passive" 00:30:18.337 } 00:30:18.337 } 00:30:18.337 ] 00:30:18.337 16:06:38 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:18.337 16:06:38 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:19.759 a579f6f6-a1fb-48a6-9819-340d57a38389 00:30:19.759 16:06:40 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:20.019 393f6f41-5898-4802-947d-8dfac32e8227 00:30:20.019 16:06:40 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:20.019 16:06:40 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:20.019 16:06:40 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:20.019 16:06:40 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:20.019 16:06:40 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:20.019 16:06:40 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:20.019 16:06:40 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:20.280 16:06:40 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:20.851 [ 00:30:20.851 { 00:30:20.851 "name": "393f6f41-5898-4802-947d-8dfac32e8227", 00:30:20.851 "aliases": [ 00:30:20.851 "lvs0/lv0" 00:30:20.851 ], 00:30:20.851 "product_name": "Logical Volume", 00:30:20.851 "block_size": 512, 00:30:20.851 "num_blocks": 204800, 00:30:20.851 "uuid": "393f6f41-5898-4802-947d-8dfac32e8227", 00:30:20.851 "assigned_rate_limits": { 00:30:20.851 "rw_ios_per_sec": 0, 00:30:20.851 "rw_mbytes_per_sec": 0, 00:30:20.851 "r_mbytes_per_sec": 0, 00:30:20.851 "w_mbytes_per_sec": 0 00:30:20.851 }, 00:30:20.851 "claimed": false, 00:30:20.851 "zoned": false, 00:30:20.851 "supported_io_types": { 00:30:20.851 "read": true, 00:30:20.851 "write": true, 00:30:20.851 "unmap": true, 00:30:20.851 "flush": false, 00:30:20.851 "reset": true, 00:30:20.851 "nvme_admin": false, 00:30:20.851 "nvme_io": false, 00:30:20.851 "nvme_io_md": false, 00:30:20.851 "write_zeroes": true, 00:30:20.851 "zcopy": false, 00:30:20.851 "get_zone_info": false, 00:30:20.851 "zone_management": false, 00:30:20.851 "zone_append": false, 00:30:20.851 "compare": false, 00:30:20.851 "compare_and_write": false, 00:30:20.851 "abort": false, 00:30:20.851 "seek_hole": true, 00:30:20.851 "seek_data": true, 00:30:20.851 "copy": false, 00:30:20.851 "nvme_iov_md": false 00:30:20.851 }, 00:30:20.851 "driver_specific": { 00:30:20.851 "lvol": { 00:30:20.851 "lvol_store_uuid": 
"a579f6f6-a1fb-48a6-9819-340d57a38389", 00:30:20.851 "base_bdev": "Nvme0n1", 00:30:20.851 "thin_provision": true, 00:30:20.851 "num_allocated_clusters": 0, 00:30:20.851 "snapshot": false, 00:30:20.851 "clone": false, 00:30:20.851 "esnap_clone": false 00:30:20.851 } 00:30:20.851 } 00:30:20.851 } 00:30:20.851 ] 00:30:20.851 16:06:41 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:20.851 16:06:41 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:20.851 16:06:41 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:20.851 [2024-07-12 16:06:41.215371] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:20.851 COMP_lvs0/lv0 00:30:20.851 16:06:41 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:20.851 16:06:41 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:20.851 16:06:41 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:20.851 16:06:41 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:20.851 16:06:41 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:20.851 16:06:41 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:20.851 16:06:41 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:21.111 16:06:41 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:21.372 [ 00:30:21.372 { 00:30:21.372 "name": "COMP_lvs0/lv0", 00:30:21.372 "aliases": [ 00:30:21.372 "1121d523-61ca-5397-a897-64d1f0267dae" 00:30:21.372 ], 00:30:21.372 "product_name": "compress", 00:30:21.372 "block_size": 4096, 00:30:21.372 "num_blocks": 25088, 00:30:21.372 "uuid": "1121d523-61ca-5397-a897-64d1f0267dae", 00:30:21.372 "assigned_rate_limits": { 00:30:21.372 "rw_ios_per_sec": 0, 00:30:21.372 "rw_mbytes_per_sec": 0, 00:30:21.372 "r_mbytes_per_sec": 0, 00:30:21.372 "w_mbytes_per_sec": 0 00:30:21.372 }, 00:30:21.372 "claimed": false, 00:30:21.372 "zoned": false, 00:30:21.372 "supported_io_types": { 00:30:21.372 "read": true, 00:30:21.372 "write": true, 00:30:21.372 "unmap": false, 00:30:21.372 "flush": false, 00:30:21.372 "reset": false, 00:30:21.372 "nvme_admin": false, 00:30:21.372 "nvme_io": false, 00:30:21.372 "nvme_io_md": false, 00:30:21.372 "write_zeroes": true, 00:30:21.372 "zcopy": false, 00:30:21.372 "get_zone_info": false, 00:30:21.372 "zone_management": false, 00:30:21.372 "zone_append": false, 00:30:21.372 "compare": false, 00:30:21.372 "compare_and_write": false, 00:30:21.372 "abort": false, 00:30:21.372 "seek_hole": false, 00:30:21.372 "seek_data": false, 00:30:21.372 "copy": false, 00:30:21.372 "nvme_iov_md": false 00:30:21.372 }, 00:30:21.372 "driver_specific": { 00:30:21.372 "compress": { 00:30:21.372 "name": "COMP_lvs0/lv0", 00:30:21.372 "base_bdev_name": "393f6f41-5898-4802-947d-8dfac32e8227" 00:30:21.372 } 00:30:21.372 } 00:30:21.372 } 00:30:21.372 ] 00:30:21.372 16:06:41 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:21.372 16:06:41 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:21.372 Running I/O for 3 seconds... 
00:30:24.670 00:30:24.670 Latency(us) 00:30:24.670 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:24.670 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:24.670 Verification LBA range: start 0x0 length 0x3100 00:30:24.670 COMP_lvs0/lv0 : 3.02 1133.09 4.43 0.00 0.00 28139.15 185.11 29037.49 00:30:24.670 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:24.670 Verification LBA range: start 0x3100 length 0x3100 00:30:24.670 COMP_lvs0/lv0 : 3.01 1139.34 4.45 0.00 0.00 27924.28 275.69 29440.79 00:30:24.670 =================================================================================================================== 00:30:24.670 Total : 2272.44 8.88 0.00 0.00 28031.48 185.11 29440.79 00:30:24.670 0 00:30:24.670 16:06:44 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:24.670 16:06:44 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:24.670 16:06:45 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:24.930 16:06:45 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:30:24.930 16:06:45 compress_isal -- compress/compress.sh@78 -- # killprocess 2703396 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2703396 ']' 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2703396 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2703396 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2703396' 00:30:24.930 killing process with pid 2703396 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@967 -- # kill 2703396 00:30:24.930 Received shutdown signal, test time was about 3.000000 seconds 00:30:24.930 00:30:24.930 Latency(us) 00:30:24.930 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:24.930 =================================================================================================================== 00:30:24.930 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:24.930 16:06:45 compress_isal -- common/autotest_common.sh@972 -- # wait 2703396 00:30:27.473 16:06:47 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:30:27.473 16:06:47 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:27.473 16:06:47 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=2705541 00:30:27.473 16:06:47 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:27.473 16:06:47 compress_isal -- compress/compress.sh@57 -- # waitforlisten 2705541 00:30:27.473 16:06:47 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:30:27.473 16:06:47 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 2705541 ']' 00:30:27.473 16:06:47 compress_isal -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:27.473 16:06:47 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:27.473 16:06:47 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:27.473 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:27.473 16:06:47 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:27.473 16:06:47 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:27.473 [2024-07-12 16:06:47.762837] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:30:27.473 [2024-07-12 16:06:47.762905] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2705541 ] 00:30:27.473 [2024-07-12 16:06:47.857261] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:27.734 [2024-07-12 16:06:47.953245] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:27.734 [2024-07-12 16:06:47.953396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:27.734 [2024-07-12 16:06:47.953397] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:28.306 16:06:48 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:28.306 16:06:48 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:30:28.306 16:06:48 compress_isal -- compress/compress.sh@58 -- # create_vols 00:30:28.306 16:06:48 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:28.306 16:06:48 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:31.606 16:06:51 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:31.606 16:06:51 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:30:31.606 16:06:51 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:31.606 16:06:51 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:31.606 16:06:51 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:31.606 16:06:51 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:31.606 16:06:51 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:31.606 16:06:51 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:31.866 [ 00:30:31.866 { 00:30:31.866 "name": "Nvme0n1", 00:30:31.866 "aliases": [ 00:30:31.866 "fbbdc006-921d-4c65-aad8-3280b6ad9b13" 00:30:31.866 ], 00:30:31.866 "product_name": "NVMe disk", 00:30:31.866 "block_size": 512, 00:30:31.866 "num_blocks": 3907029168, 00:30:31.866 "uuid": "fbbdc006-921d-4c65-aad8-3280b6ad9b13", 00:30:31.866 "assigned_rate_limits": { 00:30:31.866 "rw_ios_per_sec": 0, 00:30:31.866 "rw_mbytes_per_sec": 0, 00:30:31.866 "r_mbytes_per_sec": 0, 00:30:31.866 "w_mbytes_per_sec": 0 00:30:31.866 }, 00:30:31.866 "claimed": false, 00:30:31.866 "zoned": false, 00:30:31.866 "supported_io_types": { 00:30:31.866 "read": true, 00:30:31.866 "write": true, 00:30:31.866 "unmap": 
true, 00:30:31.866 "flush": true, 00:30:31.866 "reset": true, 00:30:31.866 "nvme_admin": true, 00:30:31.866 "nvme_io": true, 00:30:31.866 "nvme_io_md": false, 00:30:31.866 "write_zeroes": true, 00:30:31.866 "zcopy": false, 00:30:31.866 "get_zone_info": false, 00:30:31.866 "zone_management": false, 00:30:31.866 "zone_append": false, 00:30:31.866 "compare": false, 00:30:31.866 "compare_and_write": false, 00:30:31.866 "abort": true, 00:30:31.866 "seek_hole": false, 00:30:31.866 "seek_data": false, 00:30:31.866 "copy": false, 00:30:31.866 "nvme_iov_md": false 00:30:31.866 }, 00:30:31.866 "driver_specific": { 00:30:31.866 "nvme": [ 00:30:31.866 { 00:30:31.866 "pci_address": "0000:65:00.0", 00:30:31.866 "trid": { 00:30:31.866 "trtype": "PCIe", 00:30:31.866 "traddr": "0000:65:00.0" 00:30:31.866 }, 00:30:31.866 "ctrlr_data": { 00:30:31.866 "cntlid": 0, 00:30:31.866 "vendor_id": "0x8086", 00:30:31.866 "model_number": "INTEL SSDPE2KX020T8", 00:30:31.866 "serial_number": "PHLJ9512038S2P0BGN", 00:30:31.866 "firmware_revision": "VDV10184", 00:30:31.866 "oacs": { 00:30:31.866 "security": 0, 00:30:31.866 "format": 1, 00:30:31.866 "firmware": 1, 00:30:31.866 "ns_manage": 1 00:30:31.866 }, 00:30:31.866 "multi_ctrlr": false, 00:30:31.866 "ana_reporting": false 00:30:31.866 }, 00:30:31.866 "vs": { 00:30:31.866 "nvme_version": "1.2" 00:30:31.866 }, 00:30:31.866 "ns_data": { 00:30:31.866 "id": 1, 00:30:31.866 "can_share": false 00:30:31.866 } 00:30:31.866 } 00:30:31.866 ], 00:30:31.866 "mp_policy": "active_passive" 00:30:31.866 } 00:30:31.866 } 00:30:31.866 ] 00:30:31.866 16:06:52 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:31.866 16:06:52 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:33.298 1bc89160-1018-4d4b-a7aa-2dc4c91d0a3c 00:30:33.298 16:06:53 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:33.298 3a77d15e-5a42-4e8c-b67c-ea7f51257958 00:30:33.298 16:06:53 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:33.298 16:06:53 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:30:33.298 16:06:53 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:33.298 16:06:53 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:33.298 16:06:53 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:33.298 16:06:53 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:33.298 16:06:53 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:33.298 16:06:53 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:33.558 [ 00:30:33.558 { 00:30:33.558 "name": "3a77d15e-5a42-4e8c-b67c-ea7f51257958", 00:30:33.558 "aliases": [ 00:30:33.558 "lvs0/lv0" 00:30:33.558 ], 00:30:33.558 "product_name": "Logical Volume", 00:30:33.558 "block_size": 512, 00:30:33.558 "num_blocks": 204800, 00:30:33.558 "uuid": "3a77d15e-5a42-4e8c-b67c-ea7f51257958", 00:30:33.558 "assigned_rate_limits": { 00:30:33.558 "rw_ios_per_sec": 0, 00:30:33.558 "rw_mbytes_per_sec": 0, 00:30:33.558 "r_mbytes_per_sec": 0, 00:30:33.558 "w_mbytes_per_sec": 0 00:30:33.558 }, 00:30:33.558 "claimed": false, 00:30:33.558 
"zoned": false, 00:30:33.558 "supported_io_types": { 00:30:33.558 "read": true, 00:30:33.558 "write": true, 00:30:33.558 "unmap": true, 00:30:33.558 "flush": false, 00:30:33.558 "reset": true, 00:30:33.558 "nvme_admin": false, 00:30:33.558 "nvme_io": false, 00:30:33.558 "nvme_io_md": false, 00:30:33.558 "write_zeroes": true, 00:30:33.558 "zcopy": false, 00:30:33.558 "get_zone_info": false, 00:30:33.558 "zone_management": false, 00:30:33.558 "zone_append": false, 00:30:33.558 "compare": false, 00:30:33.558 "compare_and_write": false, 00:30:33.558 "abort": false, 00:30:33.558 "seek_hole": true, 00:30:33.558 "seek_data": true, 00:30:33.558 "copy": false, 00:30:33.558 "nvme_iov_md": false 00:30:33.558 }, 00:30:33.558 "driver_specific": { 00:30:33.558 "lvol": { 00:30:33.558 "lvol_store_uuid": "1bc89160-1018-4d4b-a7aa-2dc4c91d0a3c", 00:30:33.558 "base_bdev": "Nvme0n1", 00:30:33.558 "thin_provision": true, 00:30:33.558 "num_allocated_clusters": 0, 00:30:33.558 "snapshot": false, 00:30:33.558 "clone": false, 00:30:33.558 "esnap_clone": false 00:30:33.558 } 00:30:33.558 } 00:30:33.558 } 00:30:33.558 ] 00:30:33.558 16:06:53 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:33.558 16:06:53 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:33.558 16:06:53 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:33.818 [2024-07-12 16:06:54.092700] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:33.818 COMP_lvs0/lv0 00:30:33.818 16:06:54 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:33.818 16:06:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:30:33.818 16:06:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:33.818 16:06:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:30:33.818 16:06:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:33.818 16:06:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:33.818 16:06:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:34.077 16:06:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:34.077 [ 00:30:34.077 { 00:30:34.077 "name": "COMP_lvs0/lv0", 00:30:34.077 "aliases": [ 00:30:34.077 "e3a245ee-a582-5961-a8f4-eef768837e7b" 00:30:34.077 ], 00:30:34.077 "product_name": "compress", 00:30:34.077 "block_size": 512, 00:30:34.077 "num_blocks": 200704, 00:30:34.077 "uuid": "e3a245ee-a582-5961-a8f4-eef768837e7b", 00:30:34.077 "assigned_rate_limits": { 00:30:34.077 "rw_ios_per_sec": 0, 00:30:34.077 "rw_mbytes_per_sec": 0, 00:30:34.077 "r_mbytes_per_sec": 0, 00:30:34.077 "w_mbytes_per_sec": 0 00:30:34.077 }, 00:30:34.077 "claimed": false, 00:30:34.077 "zoned": false, 00:30:34.077 "supported_io_types": { 00:30:34.077 "read": true, 00:30:34.077 "write": true, 00:30:34.077 "unmap": false, 00:30:34.077 "flush": false, 00:30:34.077 "reset": false, 00:30:34.077 "nvme_admin": false, 00:30:34.077 "nvme_io": false, 00:30:34.077 "nvme_io_md": false, 00:30:34.077 "write_zeroes": true, 00:30:34.077 "zcopy": false, 00:30:34.077 "get_zone_info": false, 00:30:34.077 "zone_management": false, 00:30:34.077 "zone_append": 
false, 00:30:34.077 "compare": false, 00:30:34.077 "compare_and_write": false, 00:30:34.077 "abort": false, 00:30:34.077 "seek_hole": false, 00:30:34.077 "seek_data": false, 00:30:34.077 "copy": false, 00:30:34.077 "nvme_iov_md": false 00:30:34.077 }, 00:30:34.077 "driver_specific": { 00:30:34.077 "compress": { 00:30:34.077 "name": "COMP_lvs0/lv0", 00:30:34.077 "base_bdev_name": "3a77d15e-5a42-4e8c-b67c-ea7f51257958" 00:30:34.077 } 00:30:34.077 } 00:30:34.077 } 00:30:34.077 ] 00:30:34.336 16:06:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:30:34.336 16:06:54 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:34.336 I/O targets: 00:30:34.336 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:30:34.336 00:30:34.336 00:30:34.336 CUnit - A unit testing framework for C - Version 2.1-3 00:30:34.336 http://cunit.sourceforge.net/ 00:30:34.336 00:30:34.336 00:30:34.336 Suite: bdevio tests on: COMP_lvs0/lv0 00:30:34.336 Test: blockdev write read block ...passed 00:30:34.336 Test: blockdev write zeroes read block ...passed 00:30:34.336 Test: blockdev write zeroes read no split ...passed 00:30:34.336 Test: blockdev write zeroes read split ...passed 00:30:34.596 Test: blockdev write zeroes read split partial ...passed 00:30:34.596 Test: blockdev reset ...[2024-07-12 16:06:54.806025] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:30:34.596 passed 00:30:34.596 Test: blockdev write read 8 blocks ...passed 00:30:34.596 Test: blockdev write read size > 128k ...passed 00:30:34.596 Test: blockdev write read invalid size ...passed 00:30:34.596 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:34.596 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:34.596 Test: blockdev write read max offset ...passed 00:30:34.596 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:34.596 Test: blockdev writev readv 8 blocks ...passed 00:30:34.596 Test: blockdev writev readv 30 x 1block ...passed 00:30:34.596 Test: blockdev writev readv block ...passed 00:30:34.596 Test: blockdev writev readv size > 128k ...passed 00:30:34.596 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:34.596 Test: blockdev comparev and writev ...passed 00:30:34.596 Test: blockdev nvme passthru rw ...passed 00:30:34.596 Test: blockdev nvme passthru vendor specific ...passed 00:30:34.596 Test: blockdev nvme admin passthru ...passed 00:30:34.596 Test: blockdev copy ...passed 00:30:34.596 00:30:34.596 Run Summary: Type Total Ran Passed Failed Inactive 00:30:34.596 suites 1 1 n/a 0 0 00:30:34.596 tests 23 23 23 0 0 00:30:34.596 asserts 130 130 130 0 n/a 00:30:34.596 00:30:34.596 Elapsed time = 0.417 seconds 00:30:34.596 0 00:30:34.596 16:06:54 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:30:34.596 16:06:54 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:34.856 16:06:55 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:34.856 16:06:55 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:30:34.856 16:06:55 compress_isal -- compress/compress.sh@62 -- # killprocess 2705541 00:30:34.856 16:06:55 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 2705541 ']' 00:30:34.856 
16:06:55 compress_isal -- common/autotest_common.sh@952 -- # kill -0 2705541 00:30:34.856 16:06:55 compress_isal -- common/autotest_common.sh@953 -- # uname 00:30:34.856 16:06:55 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:34.856 16:06:55 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2705541 00:30:35.116 16:06:55 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:35.116 16:06:55 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:35.116 16:06:55 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2705541' 00:30:35.116 killing process with pid 2705541 00:30:35.116 16:06:55 compress_isal -- common/autotest_common.sh@967 -- # kill 2705541 00:30:35.116 16:06:55 compress_isal -- common/autotest_common.sh@972 -- # wait 2705541 00:30:37.657 16:06:57 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:30:37.657 16:06:57 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:30:37.657 00:30:37.657 real 0m49.790s 00:30:37.657 user 1m53.691s 00:30:37.657 sys 0m3.488s 00:30:37.657 16:06:57 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:37.657 16:06:57 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:37.657 ************************************ 00:30:37.657 END TEST compress_isal 00:30:37.657 ************************************ 00:30:37.657 16:06:57 -- common/autotest_common.sh@1142 -- # return 0 00:30:37.657 16:06:57 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:30:37.657 16:06:57 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:30:37.657 16:06:57 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:30:37.657 16:06:57 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:37.657 16:06:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:37.657 16:06:57 -- common/autotest_common.sh@10 -- # set +x 00:30:37.657 ************************************ 00:30:37.657 START TEST blockdev_crypto_aesni 00:30:37.657 ************************************ 00:30:37.657 16:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:30:37.657 * Looking for test storage... 
00:30:37.657 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2707281 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:30:37.657 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 2707281 00:30:37.658 16:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 2707281 ']' 00:30:37.658 16:06:57 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:30:37.658 16:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:37.658 16:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:37.658 16:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk.sock...' 00:30:37.658 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:37.658 16:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:37.658 16:06:57 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:37.658 [2024-07-12 16:06:58.029407] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:30:37.658 [2024-07-12 16:06:58.029549] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2707281 ] 00:30:37.918 [2024-07-12 16:06:58.173872] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:37.918 [2024-07-12 16:06:58.250441] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:38.488 16:06:58 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:38.488 16:06:58 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:30:38.488 16:06:58 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:30:38.488 16:06:58 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:30:38.488 16:06:58 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:30:38.488 16:06:58 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:38.488 16:06:58 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:38.488 [2024-07-12 16:06:58.816172] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:38.488 [2024-07-12 16:06:58.824204] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:38.488 [2024-07-12 16:06:58.832222] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:38.488 [2024-07-12 16:06:58.880479] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:41.039 true 00:30:41.039 true 00:30:41.039 true 00:30:41.039 true 00:30:41.039 Malloc0 00:30:41.039 Malloc1 00:30:41.039 Malloc2 00:30:41.039 Malloc3 00:30:41.039 [2024-07-12 16:07:01.156385] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:41.039 crypto_ram 00:30:41.039 [2024-07-12 16:07:01.164406] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:41.039 crypto_ram2 00:30:41.039 [2024-07-12 16:07:01.172426] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:41.039 crypto_ram3 00:30:41.039 [2024-07-12 16:07:01.180448] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:41.039 crypto_ram4 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:41.039 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:41.039 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:30:41.039 16:07:01 blockdev_crypto_aesni 
-- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:41.039 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:41.039 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:41.039 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:41.040 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:30:41.040 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:30:41.040 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:30:41.040 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:30:41.040 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:30:41.040 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b0cdf7b3-3dca-5f0e-90b2-0710623ffafe"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b0cdf7b3-3dca-5f0e-90b2-0710623ffafe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "808ca6f0-1c77-5ed3-a85f-ee0d37844a02"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "808ca6f0-1c77-5ed3-a85f-ee0d37844a02",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "56c4bd7e-8488-59a2-a885-31b1ef5b2175"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "56c4bd7e-8488-59a2-a885-31b1ef5b2175",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "656c2c5c-d6b5-5325-93a0-aafbaf374ca6"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "656c2c5c-d6b5-5325-93a0-aafbaf374ca6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:30:41.040 16:07:01 
blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:30:41.040 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:30:41.040 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:30:41.040 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 2707281 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 2707281 ']' 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 2707281 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2707281 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2707281' 00:30:41.040 killing process with pid 2707281 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 2707281 00:30:41.040 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 2707281 00:30:41.627 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:41.627 16:07:01 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:41.627 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:30:41.627 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:41.627 16:07:01 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:41.627 ************************************ 00:30:41.627 START TEST bdev_hello_world 00:30:41.627 ************************************ 00:30:41.627 16:07:01 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:41.627 [2024-07-12 16:07:01.891745] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:30:41.627 [2024-07-12 16:07:01.891872] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2707968 ] 00:30:41.627 [2024-07-12 16:07:02.033247] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:41.932 [2024-07-12 16:07:02.111056] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:41.932 [2024-07-12 16:07:02.132073] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:41.932 [2024-07-12 16:07:02.140097] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:41.932 [2024-07-12 16:07:02.148115] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:41.932 [2024-07-12 16:07:02.230222] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:44.472 [2024-07-12 16:07:04.393376] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:44.472 [2024-07-12 16:07:04.393425] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:44.472 [2024-07-12 16:07:04.393433] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:44.472 [2024-07-12 16:07:04.401393] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:44.472 [2024-07-12 16:07:04.401404] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:44.472 [2024-07-12 16:07:04.401410] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:44.472 [2024-07-12 16:07:04.409414] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:44.472 [2024-07-12 16:07:04.409425] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:44.472 [2024-07-12 16:07:04.409430] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:44.472 [2024-07-12 16:07:04.417433] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:44.472 [2024-07-12 16:07:04.417443] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:44.472 [2024-07-12 16:07:04.417448] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:44.472 [2024-07-12 16:07:04.478769] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:30:44.472 [2024-07-12 16:07:04.478798] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:30:44.472 [2024-07-12 16:07:04.478809] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:30:44.472 [2024-07-12 16:07:04.479832] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:30:44.472 [2024-07-12 16:07:04.479883] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:30:44.472 [2024-07-12 16:07:04.479892] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:30:44.472 [2024-07-12 16:07:04.479924] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
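The hello_world pass above is the stock hello_bdev example pointed at the crypto_aesni stack declared in bdev.json: it opens crypto_ram, writes "Hello World!", and reads it back, which is exactly what the NOTICE lines show. Rerunning it by hand outside of run_test reduces to something like the sketch below (assumes the same checkout layout and that bdev.json still declares the four Malloc-backed crypto_ram* bdevs with their test_dek_aesni_cbc_* keys; paths are relative for readability).

    # open crypto_ram from bdev.json, write "Hello World!" and read it back
    ./spdk/build/examples/hello_bdev --json ./spdk/test/bdev/bdev.json -b crypto_ram
    # on success the output ends with:
    #   read_complete: *NOTICE*: Read string from bdev : Hello World!
    #   read_complete: *NOTICE*: Stopping app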
00:30:44.472 00:30:44.472 [2024-07-12 16:07:04.479935] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:30:44.472 00:30:44.472 real 0m2.917s 00:30:44.472 user 0m2.605s 00:30:44.472 sys 0m0.282s 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:30:44.472 ************************************ 00:30:44.472 END TEST bdev_hello_world 00:30:44.472 ************************************ 00:30:44.472 16:07:04 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:30:44.472 16:07:04 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:30:44.472 16:07:04 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:44.472 16:07:04 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:44.472 16:07:04 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:44.472 ************************************ 00:30:44.472 START TEST bdev_bounds 00:30:44.472 ************************************ 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2708402 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2708402' 00:30:44.472 Process bdevio pid: 2708402 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2708402 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2708402 ']' 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:44.472 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:44.472 16:07:04 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:44.472 [2024-07-12 16:07:04.875280] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:30:44.472 [2024-07-12 16:07:04.875403] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2708402 ] 00:30:44.733 [2024-07-12 16:07:05.018327] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:44.733 [2024-07-12 16:07:05.097389] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:44.733 [2024-07-12 16:07:05.097534] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:44.733 [2024-07-12 16:07:05.097535] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:44.733 [2024-07-12 16:07:05.118545] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:44.733 [2024-07-12 16:07:05.126572] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:44.733 [2024-07-12 16:07:05.134592] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:44.992 [2024-07-12 16:07:05.224900] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:47.529 [2024-07-12 16:07:07.389264] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:47.529 [2024-07-12 16:07:07.389318] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:47.529 [2024-07-12 16:07:07.389327] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:47.529 [2024-07-12 16:07:07.397282] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:47.529 [2024-07-12 16:07:07.397293] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:47.529 [2024-07-12 16:07:07.397299] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:47.529 [2024-07-12 16:07:07.405302] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:47.529 [2024-07-12 16:07:07.405312] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:47.529 [2024-07-12 16:07:07.405317] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:47.529 [2024-07-12 16:07:07.413321] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:47.529 [2024-07-12 16:07:07.413331] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:47.529 [2024-07-12 16:07:07.413338] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:47.529 16:07:07 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:47.529 16:07:07 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:30:47.529 16:07:07 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:47.529 I/O targets: 00:30:47.529 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:30:47.529 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:30:47.529 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:30:47.529 crypto_ram4: 8192 blocks of 4096 bytes 
(32 MiB) 00:30:47.529 00:30:47.529 00:30:47.529 CUnit - A unit testing framework for C - Version 2.1-3 00:30:47.529 http://cunit.sourceforge.net/ 00:30:47.529 00:30:47.529 00:30:47.529 Suite: bdevio tests on: crypto_ram4 00:30:47.529 Test: blockdev write read block ...passed 00:30:47.529 Test: blockdev write zeroes read block ...passed 00:30:47.529 Test: blockdev write zeroes read no split ...passed 00:30:47.529 Test: blockdev write zeroes read split ...passed 00:30:47.529 Test: blockdev write zeroes read split partial ...passed 00:30:47.529 Test: blockdev reset ...passed 00:30:47.529 Test: blockdev write read 8 blocks ...passed 00:30:47.529 Test: blockdev write read size > 128k ...passed 00:30:47.529 Test: blockdev write read invalid size ...passed 00:30:47.529 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:47.529 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:47.529 Test: blockdev write read max offset ...passed 00:30:47.529 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:47.529 Test: blockdev writev readv 8 blocks ...passed 00:30:47.529 Test: blockdev writev readv 30 x 1block ...passed 00:30:47.529 Test: blockdev writev readv block ...passed 00:30:47.529 Test: blockdev writev readv size > 128k ...passed 00:30:47.529 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:47.529 Test: blockdev comparev and writev ...passed 00:30:47.529 Test: blockdev nvme passthru rw ...passed 00:30:47.529 Test: blockdev nvme passthru vendor specific ...passed 00:30:47.529 Test: blockdev nvme admin passthru ...passed 00:30:47.529 Test: blockdev copy ...passed 00:30:47.529 Suite: bdevio tests on: crypto_ram3 00:30:47.529 Test: blockdev write read block ...passed 00:30:47.529 Test: blockdev write zeroes read block ...passed 00:30:47.529 Test: blockdev write zeroes read no split ...passed 00:30:47.529 Test: blockdev write zeroes read split ...passed 00:30:47.529 Test: blockdev write zeroes read split partial ...passed 00:30:47.529 Test: blockdev reset ...passed 00:30:47.529 Test: blockdev write read 8 blocks ...passed 00:30:47.529 Test: blockdev write read size > 128k ...passed 00:30:47.529 Test: blockdev write read invalid size ...passed 00:30:47.529 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:47.529 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:47.529 Test: blockdev write read max offset ...passed 00:30:47.529 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:47.529 Test: blockdev writev readv 8 blocks ...passed 00:30:47.529 Test: blockdev writev readv 30 x 1block ...passed 00:30:47.529 Test: blockdev writev readv block ...passed 00:30:47.529 Test: blockdev writev readv size > 128k ...passed 00:30:47.529 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:47.529 Test: blockdev comparev and writev ...passed 00:30:47.529 Test: blockdev nvme passthru rw ...passed 00:30:47.529 Test: blockdev nvme passthru vendor specific ...passed 00:30:47.529 Test: blockdev nvme admin passthru ...passed 00:30:47.529 Test: blockdev copy ...passed 00:30:47.530 Suite: bdevio tests on: crypto_ram2 00:30:47.530 Test: blockdev write read block ...passed 00:30:47.530 Test: blockdev write zeroes read block ...passed 00:30:47.530 Test: blockdev write zeroes read no split ...passed 00:30:47.530 Test: blockdev write zeroes read split ...passed 00:30:47.789 Test: blockdev write zeroes read split partial ...passed 
00:30:47.789 Test: blockdev reset ...passed 00:30:47.789 Test: blockdev write read 8 blocks ...passed 00:30:47.789 Test: blockdev write read size > 128k ...passed 00:30:47.789 Test: blockdev write read invalid size ...passed 00:30:47.789 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:47.789 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:47.789 Test: blockdev write read max offset ...passed 00:30:47.789 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:47.789 Test: blockdev writev readv 8 blocks ...passed 00:30:47.789 Test: blockdev writev readv 30 x 1block ...passed 00:30:47.789 Test: blockdev writev readv block ...passed 00:30:47.789 Test: blockdev writev readv size > 128k ...passed 00:30:47.789 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:47.789 Test: blockdev comparev and writev ...passed 00:30:47.789 Test: blockdev nvme passthru rw ...passed 00:30:47.790 Test: blockdev nvme passthru vendor specific ...passed 00:30:47.790 Test: blockdev nvme admin passthru ...passed 00:30:47.790 Test: blockdev copy ...passed 00:30:47.790 Suite: bdevio tests on: crypto_ram 00:30:47.790 Test: blockdev write read block ...passed 00:30:47.790 Test: blockdev write zeroes read block ...passed 00:30:47.790 Test: blockdev write zeroes read no split ...passed 00:30:48.049 Test: blockdev write zeroes read split ...passed 00:30:48.309 Test: blockdev write zeroes read split partial ...passed 00:30:48.309 Test: blockdev reset ...passed 00:30:48.309 Test: blockdev write read 8 blocks ...passed 00:30:48.309 Test: blockdev write read size > 128k ...passed 00:30:48.309 Test: blockdev write read invalid size ...passed 00:30:48.309 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:48.309 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:48.309 Test: blockdev write read max offset ...passed 00:30:48.309 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:48.309 Test: blockdev writev readv 8 blocks ...passed 00:30:48.309 Test: blockdev writev readv 30 x 1block ...passed 00:30:48.309 Test: blockdev writev readv block ...passed 00:30:48.309 Test: blockdev writev readv size > 128k ...passed 00:30:48.309 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:48.309 Test: blockdev comparev and writev ...passed 00:30:48.309 Test: blockdev nvme passthru rw ...passed 00:30:48.309 Test: blockdev nvme passthru vendor specific ...passed 00:30:48.309 Test: blockdev nvme admin passthru ...passed 00:30:48.309 Test: blockdev copy ...passed 00:30:48.309 00:30:48.309 Run Summary: Type Total Ran Passed Failed Inactive 00:30:48.309 suites 4 4 n/a 0 0 00:30:48.309 tests 92 92 92 0 0 00:30:48.309 asserts 520 520 520 0 n/a 00:30:48.309 00:30:48.309 Elapsed time = 1.871 seconds 00:30:48.309 0 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2708402 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2708402 ']' 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2708402 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o 
comm= 2708402 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2708402' 00:30:48.309 killing process with pid 2708402 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2708402 00:30:48.309 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2708402 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:30:48.569 00:30:48.569 real 0m4.076s 00:30:48.569 user 0m10.795s 00:30:48.569 sys 0m0.454s 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:48.569 ************************************ 00:30:48.569 END TEST bdev_bounds 00:30:48.569 ************************************ 00:30:48.569 16:07:08 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:30:48.569 16:07:08 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:30:48.569 16:07:08 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:30:48.569 16:07:08 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:48.569 16:07:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:48.569 ************************************ 00:30:48.569 START TEST bdev_nbd 00:30:48.569 ************************************ 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 
-- # bdev_num=4 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2709070 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2709070 /var/tmp/spdk-nbd.sock 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2709070 ']' 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:30:48.569 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:48.569 16:07:08 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:48.569 [2024-07-12 16:07:08.997439] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:30:48.569 [2024-07-12 16:07:08.997494] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:48.829 [2024-07-12 16:07:09.088714] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:48.829 [2024-07-12 16:07:09.157758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:48.829 [2024-07-12 16:07:09.178751] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:48.829 [2024-07-12 16:07:09.186769] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:48.829 [2024-07-12 16:07:09.194787] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:49.089 [2024-07-12 16:07:09.280881] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:51.628 [2024-07-12 16:07:11.450254] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:51.628 [2024-07-12 16:07:11.450303] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:51.628 [2024-07-12 16:07:11.450312] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:51.628 [2024-07-12 16:07:11.458272] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:51.628 [2024-07-12 16:07:11.458283] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:51.628 [2024-07-12 16:07:11.458289] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:51.628 [2024-07-12 16:07:11.466291] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:51.628 [2024-07-12 16:07:11.466301] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:51.628 [2024-07-12 16:07:11.466310] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:51.628 [2024-07-12 16:07:11.474311] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:51.628 [2024-07-12 16:07:11.474321] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:51.628 [2024-07-12 16:07:11.474327] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:51.628 1+0 records in 00:30:51.628 1+0 records out 00:30:51.628 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00031143 s, 13.2 MB/s 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:51.628 16:07:11 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:30:51.628 
16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:51.628 1+0 records in 00:30:51.628 1+0 records out 00:30:51.628 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269388 s, 15.2 MB/s 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:51.628 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:51.888 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 
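The waitfornbd helper being traced here (common/autotest_common.sh, roughly lines 866-887 per the markers) does two things: it polls /proc/partitions until the requested nbdX entry appears, then proves the device is actually serving I/O with a single 4 KiB O_DIRECT read via dd, checking that a non-empty file came back. A condensed, self-contained version of that pattern might look like the following; the retry count of 20 is taken from the trace, while the sleep between retries is an assumption (the real helper may pace itself differently):

    waitfornbd() {
        local nbd_name=$1 i size
        for ((i = 1; i <= 20; i++)); do                       # bounded wait for /dev/$nbd_name to appear
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1                                         # assumption: real pacing may differ
        done
        for ((i = 1; i <= 20; i++)); do                       # bounded wait for the device to answer a read
            dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct && break
            sleep 0.1
        done
        size=$(stat -c %s /tmp/nbdtest)                       # the trace checks the copied size is non-zero
        rm -f /tmp/nbdtest
        [ "$size" != 0 ]
    }

It is called as `waitfornbd nbd1` immediately after each rpc.py nbd_start_disk, which is exactly the sequence the trace keeps repeating for nbd0 through nbd3.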
00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:51.889 1+0 records in 00:30:51.889 1+0 records out 00:30:51.889 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276015 s, 14.8 MB/s 00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:51.889 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:52.148 1+0 records in 00:30:52.148 1+0 records out 00:30:52.148 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000270677 s, 15.1 MB/s 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:52.148 16:07:12 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:52.148 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:30:52.408 { 00:30:52.408 "nbd_device": "/dev/nbd0", 00:30:52.408 "bdev_name": "crypto_ram" 00:30:52.408 }, 00:30:52.408 { 00:30:52.408 "nbd_device": "/dev/nbd1", 00:30:52.408 "bdev_name": "crypto_ram2" 00:30:52.408 }, 00:30:52.408 { 00:30:52.408 "nbd_device": "/dev/nbd2", 00:30:52.408 "bdev_name": "crypto_ram3" 00:30:52.408 }, 00:30:52.408 { 00:30:52.408 "nbd_device": "/dev/nbd3", 00:30:52.408 "bdev_name": "crypto_ram4" 00:30:52.408 } 00:30:52.408 ]' 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:30:52.408 { 00:30:52.408 "nbd_device": "/dev/nbd0", 00:30:52.408 "bdev_name": "crypto_ram" 00:30:52.408 }, 00:30:52.408 { 00:30:52.408 "nbd_device": "/dev/nbd1", 00:30:52.408 "bdev_name": "crypto_ram2" 00:30:52.408 }, 00:30:52.408 { 00:30:52.408 "nbd_device": "/dev/nbd2", 00:30:52.408 "bdev_name": "crypto_ram3" 00:30:52.408 }, 00:30:52.408 { 00:30:52.408 "nbd_device": "/dev/nbd3", 00:30:52.408 "bdev_name": "crypto_ram4" 00:30:52.408 } 00:30:52.408 ]' 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:52.408 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 
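With all four devices mapped (the nbd_get_disks JSON above pairs /dev/nbd0-/dev/nbd3 with crypto_ram-crypto_ram4), nbd_stop_disks tears the set down: each device is detached with the nbd_stop_disk RPC and waitfornbd_exit polls /proc/partitions until its entry disappears. A rough equivalent of that loop, using the socket path from the trace (the sleep between polls is again an assumption):

    for dev in /dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3; do
        /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py \
            -s /var/tmp/spdk-nbd.sock nbd_stop_disk "$dev"
        name=$(basename "$dev")
        for ((i = 1; i <= 20; i++)); do                       # waitfornbd_exit: done once the kernel drops the entry
            grep -q -w "$name" /proc/partitions || break
            sleep 0.1                                         # assumption: real pacing may differ
        done
    done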
00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:52.667 16:07:12 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:53.235 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
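The nbd_get_count helper that begins here determines how many NBD devices are still attached after the teardown: it asks the target for its current nbd_get_disks JSON, extracts the .nbd_device fields with jq, and counts how many name a /dev/nbd node; the test then asserts the count is 0. Condensed into one pipeline (the real helper, as the trace that follows shows, stores the JSON and the extracted names in intermediate variables first):

    nbd_get_count() {
        local rpc_server=$1
        /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s "$rpc_server" nbd_get_disks \
            | jq -r '.[] | .nbd_device' \
            | grep -c /dev/nbd || true    # grep -c prints 0 but exits non-zero on no match, hence the true
    }
    # e.g.  [ "$(nbd_get_count /var/tmp/spdk-nbd.sock)" -eq 0 ]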
00:30:53.496 16:07:13 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:53.758 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:30:54.019 /dev/nbd0 00:30:54.019 16:07:14 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:54.019 1+0 records in 00:30:54.019 1+0 records out 00:30:54.019 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287264 s, 14.3 MB/s 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:54.019 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:30:54.280 /dev/nbd1 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:54.280 1+0 records in 00:30:54.280 1+0 records out 00:30:54.280 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000334493 s, 12.2 MB/s 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:54.280 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:30:54.541 /dev/nbd10 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:54.541 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:54.541 1+0 records in 00:30:54.541 1+0 records out 00:30:54.541 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328398 s, 12.5 MB/s 00:30:54.542 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:54.542 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:54.542 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:54.542 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:54.542 16:07:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:54.542 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:54.542 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # 
(( i < 4 )) 00:30:54.542 16:07:14 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:30:54.803 /dev/nbd11 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:54.803 1+0 records in 00:30:54.803 1+0 records out 00:30:54.803 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300666 s, 13.6 MB/s 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:54.803 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:30:55.064 { 00:30:55.064 "nbd_device": "/dev/nbd0", 00:30:55.064 "bdev_name": "crypto_ram" 00:30:55.064 }, 00:30:55.064 { 00:30:55.064 "nbd_device": "/dev/nbd1", 00:30:55.064 "bdev_name": "crypto_ram2" 00:30:55.064 }, 00:30:55.064 { 00:30:55.064 "nbd_device": "/dev/nbd10", 00:30:55.064 "bdev_name": "crypto_ram3" 00:30:55.064 }, 00:30:55.064 { 00:30:55.064 "nbd_device": "/dev/nbd11", 00:30:55.064 "bdev_name": "crypto_ram4" 00:30:55.064 } 00:30:55.064 ]' 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 
00:30:55.064 { 00:30:55.064 "nbd_device": "/dev/nbd0", 00:30:55.064 "bdev_name": "crypto_ram" 00:30:55.064 }, 00:30:55.064 { 00:30:55.064 "nbd_device": "/dev/nbd1", 00:30:55.064 "bdev_name": "crypto_ram2" 00:30:55.064 }, 00:30:55.064 { 00:30:55.064 "nbd_device": "/dev/nbd10", 00:30:55.064 "bdev_name": "crypto_ram3" 00:30:55.064 }, 00:30:55.064 { 00:30:55.064 "nbd_device": "/dev/nbd11", 00:30:55.064 "bdev_name": "crypto_ram4" 00:30:55.064 } 00:30:55.064 ]' 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:30:55.064 /dev/nbd1 00:30:55.064 /dev/nbd10 00:30:55.064 /dev/nbd11' 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:30:55.064 /dev/nbd1 00:30:55.064 /dev/nbd10 00:30:55.064 /dev/nbd11' 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:30:55.064 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:30:55.064 256+0 records in 00:30:55.064 256+0 records out 00:30:55.064 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115003 s, 91.2 MB/s 00:30:55.065 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:55.065 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:30:55.326 256+0 records in 00:30:55.326 256+0 records out 00:30:55.326 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0436967 s, 24.0 MB/s 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:30:55.326 256+0 records in 00:30:55.326 256+0 records out 00:30:55.326 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.044255 s, 23.7 MB/s 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd 
-- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:30:55.326 256+0 records in 00:30:55.326 256+0 records out 00:30:55.326 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0385426 s, 27.2 MB/s 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:30:55.326 256+0 records in 00:30:55.326 256+0 records out 00:30:55.326 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0368686 s, 28.4 MB/s 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:55.326 
16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:55.326 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:55.587 16:07:15 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:55.847 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:56.108 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:30:56.369 16:07:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:30:56.630 malloc_lvol_verify 00:30:56.630 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:30:56.890 2563a9cb-606f-4420-b171-b99ec3377abc 00:30:56.890 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:30:57.151 f7fd7f0c-13e3-41c7-8a33-25493225378b 00:30:57.151 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:30:57.151 /dev/nbd0 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:30:57.411 mke2fs 1.46.5 (30-Dec-2021) 00:30:57.411 Discarding device blocks: 0/4096 done 00:30:57.411 Creating filesystem with 4096 1k blocks and 1024 inodes 00:30:57.411 00:30:57.411 Allocating group tables: 0/1 done 00:30:57.411 Writing inode tables: 0/1 done 00:30:57.411 Creating journal (1024 blocks): done 00:30:57.411 Writing superblocks and filesystem accounting information: 0/1 done 00:30:57.411 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2709070 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2709070 ']' 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2709070 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:57.411 16:07:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2709070 00:30:57.671 16:07:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:57.671 16:07:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:57.671 16:07:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2709070' 00:30:57.671 killing process with pid 2709070 00:30:57.671 16:07:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2709070 00:30:57.671 16:07:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2709070 00:30:57.671 16:07:18 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:30:57.671 00:30:57.671 real 0m9.162s 00:30:57.671 user 0m12.771s 00:30:57.671 sys 0m2.632s 00:30:57.671 16:07:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:57.671 16:07:18 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:57.671 ************************************ 00:30:57.671 END TEST bdev_nbd 00:30:57.671 ************************************ 00:30:57.933 16:07:18 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:30:57.933 16:07:18 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:30:57.933 16:07:18 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:30:57.933 16:07:18 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:30:57.933 16:07:18 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:30:57.933 16:07:18 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:30:57.933 16:07:18 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:57.933 16:07:18 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:57.933 ************************************ 00:30:57.933 START TEST bdev_fio 00:30:57.933 ************************************ 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:57.933 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1281 -- # local workload=verify 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:57.933 ************************************ 00:30:57.933 START TEST bdev_fio_rw_verify 00:30:57.933 ************************************ 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:57.933 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:57.934 16:07:18 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:58.523 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:58.524 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:58.524 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:58.524 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:58.524 fio-3.35 00:30:58.524 Starting 4 threads 00:31:13.487 00:31:13.487 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2711517: Fri Jul 12 16:07:31 2024 00:31:13.487 read: IOPS=22.7k, BW=88.7MiB/s (93.0MB/s)(887MiB/10001msec) 00:31:13.487 slat (usec): min=14, max=867, avg=53.99, stdev=44.07 00:31:13.487 clat (usec): min=8, max=2634, avg=304.14, stdev=271.22 00:31:13.487 lat (usec): min=26, max=2834, avg=358.14, stdev=305.10 00:31:13.487 clat percentiles (usec): 00:31:13.487 | 50.000th=[ 208], 99.000th=[ 1418], 99.900th=[ 1795], 99.990th=[ 2278], 00:31:13.487 | 99.999th=[ 2540] 00:31:13.487 write: IOPS=24.9k, BW=97.4MiB/s (102MB/s)(951MiB/9761msec); 0 zone resets 00:31:13.487 slat (usec): min=16, max=474, avg=70.44, stdev=50.73 00:31:13.487 clat (usec): min=25, max=3029, avg=411.27, stdev=360.50 00:31:13.487 lat (usec): min=51, max=3223, avg=481.71, stdev=401.62 00:31:13.487 clat percentiles (usec): 00:31:13.487 | 50.000th=[ 285], 99.000th=[ 1860], 99.900th=[ 2147], 99.990th=[ 2507], 00:31:13.487 | 99.999th=[ 2868] 00:31:13.487 bw ( KiB/s): min=67616, max=124112, per=97.05%, avg=96800.42, stdev=3840.00, samples=76 00:31:13.487 iops : min=16904, max=31028, avg=24200.11, stdev=960.00, samples=76 00:31:13.487 lat (usec) : 10=0.01%, 20=0.01%, 50=1.88%, 100=11.13%, 250=37.11% 00:31:13.487 lat (usec) : 500=26.83%, 750=13.26%, 1000=5.07% 00:31:13.487 lat (msec) : 2=4.51%, 4=0.20% 00:31:13.487 cpu : usr=99.58%, sys=0.00%, ctx=62, majf=0, minf=198 00:31:13.487 IO depths : 1=10.4%, 2=23.6%, 4=52.8%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:13.487 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:13.487 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:13.487 issued rwts: 
total=227102,243395,0,0 short=0,0,0,0 dropped=0,0,0,0 00:31:13.487 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:13.487 00:31:13.487 Run status group 0 (all jobs): 00:31:13.487 READ: bw=88.7MiB/s (93.0MB/s), 88.7MiB/s-88.7MiB/s (93.0MB/s-93.0MB/s), io=887MiB (930MB), run=10001-10001msec 00:31:13.487 WRITE: bw=97.4MiB/s (102MB/s), 97.4MiB/s-97.4MiB/s (102MB/s-102MB/s), io=951MiB (997MB), run=9761-9761msec 00:31:13.487 00:31:13.487 real 0m13.531s 00:31:13.487 user 0m49.120s 00:31:13.487 sys 0m0.492s 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:31:13.487 ************************************ 00:31:13.487 END TEST bdev_fio_rw_verify 00:31:13.487 ************************************ 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:31:13.487 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b0cdf7b3-3dca-5f0e-90b2-0710623ffafe"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b0cdf7b3-3dca-5f0e-90b2-0710623ffafe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "808ca6f0-1c77-5ed3-a85f-ee0d37844a02"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "808ca6f0-1c77-5ed3-a85f-ee0d37844a02",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "56c4bd7e-8488-59a2-a885-31b1ef5b2175"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "56c4bd7e-8488-59a2-a885-31b1ef5b2175",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram4",' ' "aliases": [' ' "656c2c5c-d6b5-5325-93a0-aafbaf374ca6"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "656c2c5c-d6b5-5325-93a0-aafbaf374ca6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:31:13.488 crypto_ram2 00:31:13.488 crypto_ram3 00:31:13.488 crypto_ram4 ]] 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "b0cdf7b3-3dca-5f0e-90b2-0710623ffafe"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "b0cdf7b3-3dca-5f0e-90b2-0710623ffafe",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "808ca6f0-1c77-5ed3-a85f-ee0d37844a02"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "808ca6f0-1c77-5ed3-a85f-ee0d37844a02",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "56c4bd7e-8488-59a2-a885-31b1ef5b2175"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "56c4bd7e-8488-59a2-a885-31b1ef5b2175",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "656c2c5c-d6b5-5325-93a0-aafbaf374ca6"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "656c2c5c-d6b5-5325-93a0-aafbaf374ca6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:31:13.488 16:07:31 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:13.488 16:07:31 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:13.488 ************************************ 00:31:13.488 START TEST bdev_fio_trim 00:31:13.488 ************************************ 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:13.488 16:07:32 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:13.488 16:07:32 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:13.488 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:13.488 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:13.488 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:13.488 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:13.488 fio-3.35 00:31:13.488 Starting 4 threads 00:31:25.709 00:31:25.709 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2713821: Fri Jul 12 16:07:45 2024 00:31:25.709 write: IOPS=69.3k, BW=271MiB/s (284MB/s)(2709MiB/10001msec); 0 zone resets 00:31:25.709 slat (usec): min=11, max=625, avg=34.77, stdev=23.76 00:31:25.709 clat (usec): min=32, max=1294, avg=164.19, stdev=121.36 
00:31:25.709 lat (usec): min=43, max=1452, avg=198.96, stdev=138.18 00:31:25.709 clat percentiles (usec): 00:31:25.709 | 50.000th=[ 121], 99.000th=[ 523], 99.900th=[ 725], 99.990th=[ 938], 00:31:25.709 | 99.999th=[ 1188] 00:31:25.709 bw ( KiB/s): min=202824, max=298160, per=99.83%, avg=276870.09, stdev=6019.62, samples=77 00:31:25.709 iops : min=50706, max=74540, avg=69217.42, stdev=1504.89, samples=77 00:31:25.709 trim: IOPS=69.3k, BW=271MiB/s (284MB/s)(2709MiB/10001msec); 0 zone resets 00:31:25.709 slat (usec): min=3, max=752, avg= 7.28, stdev= 3.81 00:31:25.709 clat (usec): min=43, max=857, avg=141.41, stdev=65.91 00:31:25.709 lat (usec): min=47, max=957, avg=148.69, stdev=66.97 00:31:25.709 clat percentiles (usec): 00:31:25.709 | 50.000th=[ 129], 99.000th=[ 347], 99.900th=[ 474], 99.990th=[ 635], 00:31:25.709 | 99.999th=[ 783] 00:31:25.709 bw ( KiB/s): min=202824, max=298160, per=99.83%, avg=276871.74, stdev=6019.64, samples=77 00:31:25.709 iops : min=50706, max=74540, avg=69217.83, stdev=1504.90, samples=77 00:31:25.709 lat (usec) : 50=5.25%, 100=28.93%, 250=51.22%, 500=13.83%, 750=0.73% 00:31:25.709 lat (usec) : 1000=0.04% 00:31:25.709 lat (msec) : 2=0.01% 00:31:25.709 cpu : usr=99.75%, sys=0.00%, ctx=95, majf=0, minf=99 00:31:25.709 IO depths : 1=8.4%, 2=22.3%, 4=55.5%, 8=13.9%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:25.709 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:25.709 complete : 0=0.0%, 4=87.8%, 8=12.2%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:25.709 issued rwts: total=0,693391,693391,0 short=0,0,0,0 dropped=0,0,0,0 00:31:25.709 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:25.709 00:31:25.709 Run status group 0 (all jobs): 00:31:25.709 WRITE: bw=271MiB/s (284MB/s), 271MiB/s-271MiB/s (284MB/s-284MB/s), io=2709MiB (2840MB), run=10001-10001msec 00:31:25.709 TRIM: bw=271MiB/s (284MB/s), 271MiB/s-271MiB/s (284MB/s-284MB/s), io=2709MiB (2840MB), run=10001-10001msec 00:31:25.709 00:31:25.709 real 0m13.637s 00:31:25.709 user 0m50.378s 00:31:25.709 sys 0m0.508s 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:31:25.709 ************************************ 00:31:25.709 END TEST bdev_fio_trim 00:31:25.709 ************************************ 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:31:25.709 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:31:25.709 00:31:25.709 real 0m27.523s 00:31:25.709 user 1m39.697s 00:31:25.709 sys 0m1.173s 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:25.709 ************************************ 00:31:25.709 END TEST bdev_fio 00:31:25.709 ************************************ 00:31:25.709 16:07:45 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:25.709 
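The bdev_fio_rw_verify and bdev_fio_trim phases above both follow the same pattern: generate a bdev.fio with one [job_*] section per crypto bdev, then run stock fio with the SPDK bdev plugin LD_PRELOAD'ed so that ioengine=spdk_bdev resolves. A minimal sketch of that pattern, using only the options visible in the trace; the verify/trim settings normally cat'ed in from the fio_config_gen template are not reproduced in this excerpt, so that part is only hinted at here:

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

# Build the fio job file roughly the way the trace does: serialize_overlap=1
# (emitted when bdev_type is AIO and fio is 3.x, per the checks at
# autotest_common.sh@1323-1325) plus one job section per crypto bdev.
# The workload/verify options from the fio_config_gen template are omitted.
{
  echo 'serialize_overlap=1'
  for b in crypto_ram crypto_ram2 crypto_ram3 crypto_ram4; do
    echo "[job_${b}]"
    echo "filename=${b}"
  done
} > "$SPDK/test/bdev/bdev.fio"

# fio_bdev/fio_plugin amount to: preload the SPDK bdev engine so that
# ioengine=spdk_bdev resolves, and point fio at the JSON config describing
# the crypto bdev stack.
LD_PRELOAD="$SPDK/build/fio/spdk_bdev" /usr/src/fio/fio \
  --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
  --verify_state_save=0 --spdk_mem=0 \
  --spdk_json_conf="$SPDK/test/bdev/bdev.json" \
  --aux-path="$SPDK/../output" \
  "$SPDK/test/bdev/bdev.fio"

The rw_verify phase then appears in fio as rw=randwrite with verification, and the trim phase as rw=trimwrite, matching the per-job banner lines fio prints at startup in the trace above.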
16:07:45 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:25.709 16:07:45 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:25.709 16:07:45 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:31:25.709 16:07:45 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:25.709 16:07:45 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:25.709 ************************************ 00:31:25.709 START TEST bdev_verify 00:31:25.709 ************************************ 00:31:25.709 16:07:45 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:25.709 [2024-07-12 16:07:45.830173] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:31:25.709 [2024-07-12 16:07:45.830238] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2715633 ] 00:31:25.709 [2024-07-12 16:07:45.918090] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:25.709 [2024-07-12 16:07:45.984727] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:25.709 [2024-07-12 16:07:45.984807] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:25.709 [2024-07-12 16:07:46.005972] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:25.709 [2024-07-12 16:07:46.013997] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:25.709 [2024-07-12 16:07:46.022023] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:25.709 [2024-07-12 16:07:46.107577] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:28.252 [2024-07-12 16:07:48.267670] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:28.252 [2024-07-12 16:07:48.267732] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:28.252 [2024-07-12 16:07:48.267741] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:28.252 [2024-07-12 16:07:48.275686] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:28.252 [2024-07-12 16:07:48.275697] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:28.252 [2024-07-12 16:07:48.275702] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:28.252 [2024-07-12 16:07:48.283706] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:28.252 [2024-07-12 16:07:48.283719] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:28.252 [2024-07-12 16:07:48.283725] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: 
vbdev creation deferred pending base bdev arrival 00:31:28.252 [2024-07-12 16:07:48.291728] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:28.252 [2024-07-12 16:07:48.291739] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:28.252 [2024-07-12 16:07:48.291744] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:28.252 Running I/O for 5 seconds... 00:31:33.538 00:31:33.538 Latency(us) 00:31:33.538 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:33.538 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:33.538 Verification LBA range: start 0x0 length 0x1000 00:31:33.538 crypto_ram : 5.06 581.90 2.27 0.00 0.00 219529.76 9427.10 133088.49 00:31:33.538 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:33.538 Verification LBA range: start 0x1000 length 0x1000 00:31:33.538 crypto_ram : 5.07 479.97 1.87 0.00 0.00 265985.91 15325.34 160512.79 00:31:33.538 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:33.538 Verification LBA range: start 0x0 length 0x1000 00:31:33.538 crypto_ram2 : 5.06 581.80 2.27 0.00 0.00 218895.39 10132.87 124215.93 00:31:33.538 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:33.538 Verification LBA range: start 0x1000 length 0x1000 00:31:33.538 crypto_ram2 : 5.07 479.87 1.87 0.00 0.00 265157.42 15224.52 145187.45 00:31:33.538 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:33.538 Verification LBA range: start 0x0 length 0x1000 00:31:33.538 crypto_ram3 : 5.05 4562.17 17.82 0.00 0.00 27780.53 7108.14 24500.38 00:31:33.538 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:33.538 Verification LBA range: start 0x1000 length 0x1000 00:31:33.538 crypto_ram3 : 5.06 3755.68 14.67 0.00 0.00 33709.17 2495.41 27222.65 00:31:33.538 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:33.538 Verification LBA range: start 0x0 length 0x1000 00:31:33.538 crypto_ram4 : 5.06 4579.47 17.89 0.00 0.00 27650.22 1461.96 23492.14 00:31:33.538 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:33.538 Verification LBA range: start 0x1000 length 0x1000 00:31:33.538 crypto_ram4 : 5.06 3768.90 14.72 0.00 0.00 33563.40 1701.42 26819.35 00:31:33.538 =================================================================================================================== 00:31:33.538 Total : 18789.76 73.40 0.00 0.00 54131.91 1461.96 160512.79 00:31:33.538 00:31:33.538 real 0m7.927s 00:31:33.538 user 0m15.264s 00:31:33.538 sys 0m0.251s 00:31:33.538 16:07:53 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:33.538 16:07:53 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:31:33.538 ************************************ 00:31:33.538 END TEST bdev_verify 00:31:33.538 ************************************ 00:31:33.538 16:07:53 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:31:33.538 16:07:53 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:33.538 16:07:53 
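The bdev_verify run that just finished and the bdev_verify_big_io run launched next both drive the bdevperf example app directly against the same bdev.json. Condensed from the trace into plain command lines (paths as in this workspace):

SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk

# bdev_verify: 4 KiB verify workload, queue depth 128, 5 s run, core mask 0x3
"$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
  -q 128 -o 4096 -w verify -t 5 -C -m 0x3

# bdev_verify_big_io: identical except for the 64 KiB I/O size
"$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
  -q 128 -o 65536 -w verify -t 5 -C -m 0x3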
blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:31:33.538 16:07:53 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:33.538 16:07:53 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:33.538 ************************************ 00:31:33.538 START TEST bdev_verify_big_io 00:31:33.538 ************************************ 00:31:33.538 16:07:53 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:33.538 [2024-07-12 16:07:53.846822] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:31:33.538 [2024-07-12 16:07:53.846875] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2716904 ] 00:31:33.538 [2024-07-12 16:07:53.937071] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:33.800 [2024-07-12 16:07:54.013633] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:33.800 [2024-07-12 16:07:54.013637] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:33.800 [2024-07-12 16:07:54.034835] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:33.800 [2024-07-12 16:07:54.042862] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:33.800 [2024-07-12 16:07:54.050885] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:33.800 [2024-07-12 16:07:54.133453] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:36.340 [2024-07-12 16:07:56.298064] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:36.340 [2024-07-12 16:07:56.298116] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:36.340 [2024-07-12 16:07:56.298124] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:36.340 [2024-07-12 16:07:56.306080] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:36.340 [2024-07-12 16:07:56.306091] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:36.340 [2024-07-12 16:07:56.306097] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:36.340 [2024-07-12 16:07:56.314100] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:36.340 [2024-07-12 16:07:56.314110] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:36.340 [2024-07-12 16:07:56.314116] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:36.340 [2024-07-12 16:07:56.322120] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:36.340 [2024-07-12 16:07:56.322130] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:36.340 [2024-07-12 16:07:56.322135] vbdev_crypto.c: 
617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:36.340 Running I/O for 5 seconds... 00:31:36.910 [2024-07-12 16:07:57.116563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.116975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.117102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.117148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.117186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.117499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.117510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.119044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.119092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.119129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.119166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.119572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.119612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.119649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.119700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.120152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.120164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.121354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.121399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.121436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.121486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.122042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.122082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.910 [2024-07-12 16:07:57.122119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.122155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.122464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.122476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.123593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.123649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.123685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.123725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.124171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.124211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.124257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.124294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.124685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.124696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.126346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.126397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.126444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.126481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.126934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.126978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.127015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.127056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.127526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.127537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.910 [2024-07-12 16:07:57.128541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.128585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.128622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.128658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.129001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.129041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.129078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.129115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.129412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.129423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.130364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.130408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.130444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.130494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.131129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.131171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.131208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.131245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.131578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.131589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.132498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.132542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.132578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.132615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.910 [2024-07-12 16:07:57.132962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.133008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.133045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.133080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.133339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.133353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.136108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.136156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.136192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.137995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.138041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.138077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.139836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.139882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.139918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.141579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.141625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.141661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.143453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.143502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.143538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.145205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.910 [2024-07-12 16:07:57.145252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:36.911 [2024-07-12 16:07:57.145302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:36.911 [2024-07-12 16:07:57.147101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... "Failed to get src_mbufs!" from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated for timestamps 16:07:57.147147 through 16:07:57.177455 ...]
00:31:36.911 [2024-07-12 16:07:57.177748] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... "Failed to get dst_mbufs!" from accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources repeated for timestamps 16:07:57.177811 through 16:07:57.271440 ...]
00:31:36.912 [2024-07-12 16:07:57.272605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... "Failed to get src_mbufs!" from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated for timestamps 16:07:57.274054 through 16:07:57.558008 ...]
00:31:37.177 [2024-07-12 16:07:57.559593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:37.177 [2024-07-12 16:07:57.561204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.177 [2024-07-12 16:07:57.561484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.177 [2024-07-12 16:07:57.561608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.177 [2024-07-12 16:07:57.562870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.177 [2024-07-12 16:07:57.564537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.177 [2024-07-12 16:07:57.566195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.177 [2024-07-12 16:07:57.566474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.177 [2024-07-12 16:07:57.568034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.569480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.570936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.572605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.573003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.574534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.575976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.577622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.578718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.579170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.581572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.583232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.584183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.585628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.585956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.587676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.588324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.178 [2024-07-12 16:07:57.588717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.590244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.590558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.592740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.594235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.595783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.597437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.597901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.598354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.599812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.601236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.602880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.603226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.605645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.606923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.607324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.608284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.608596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.610115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.611780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.612781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.614233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.614559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.178 [2024-07-12 16:07:57.615868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.178 [2024-07-12 16:07:57.617081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.618633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.619076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.619365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.620602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.621003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.622078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.623492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.623836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.626146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.627351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.627746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.628133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.628511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.628969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.629377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.629786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.439 [2024-07-12 16:07:57.630174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.630607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.632068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.632469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.632875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.633264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.633730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.440 [2024-07-12 16:07:57.633843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.634236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.634636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.635035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.635521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.636812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.637211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.637600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.637998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.638421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.638882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.638997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.640910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.641308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.641701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.642095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.642582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.642752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.643146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.643534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.643933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.644314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.645639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.646054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.440 [2024-07-12 16:07:57.646449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.646857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.647267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.647726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.648121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.648512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.648906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.649351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.650688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.651093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.651485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.651885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.652294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.652750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.653155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.653546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.653943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.654313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.656004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.656403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.656800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.657200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.657597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.658062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.440 [2024-07-12 16:07:57.658461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.658859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.659245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.659627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.660972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.661369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.661768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.662161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.662629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.663088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.663487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.663884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.664274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.664681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.666492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.666902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.667296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.667683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.668075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.668525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.668925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.669318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.669705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.670074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.440 [2024-07-12 16:07:57.671888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.673542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.674016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.674404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.674683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.675248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.676525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.676921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.677393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.677697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.679414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.680874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.682528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.683580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.683873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.685562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.687108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.687497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.688062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.688404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.690157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.440 [2024-07-12 16:07:57.691610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.693056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.694699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.441 [2024-07-12 16:07:57.695067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.695520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.696728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.698167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.699605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.699887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.702241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.703911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.704300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.704686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.704975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.706626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.708275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.709523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.711196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.711475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.712640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.713372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.714833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.716291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.716570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.717572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.719024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.720475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.441 [2024-07-12 16:07:57.722132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.722523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.724790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.726232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.727911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.729083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.729425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.730936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.732571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.732963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.733350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.733627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.735713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.737392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.739057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.740723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.741001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.741453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.742102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.743556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.744995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.745273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.747627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.749290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.441 [2024-07-12 16:07:57.750118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.750505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.750790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.752306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.753761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.755442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.756633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.756948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.758178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.758577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.760100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.761567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.761848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.763074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.764739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.766357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.767989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.768267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.771365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.772915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.774551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.775832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.776112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.777840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.441 [2024-07-12 16:07:57.779508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.780957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.781344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.781810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.784241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.785197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.786619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.788055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.788333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.789459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.789860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.790935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.792379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.792748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.795068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.796530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.798175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.798575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.798974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.800486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.802030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.803620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.805226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.805512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.441 [2024-07-12 16:07:57.808034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.441 [2024-07-12 16:07:57.808432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.808824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.810324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.810603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.812328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.813501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.815165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.816841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.817119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.818587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.820028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.821479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.823150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.823570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.825120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.826600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.828253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.829209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.829648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.832063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.833720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.834663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.836125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.442 [2024-07-12 16:07:57.836443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.838173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.838780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.839169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.840641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.840982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.843085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.844522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.846001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.847669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.848115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.848572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.850239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.851907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.853592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.853873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.856274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.857822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.858211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.858277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.858737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.860241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.861700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.442 [2024-07-12 16:07:57.863362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.442 [2024-07-12 16:07:57.864279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:37.442 [... same *ERROR*: Failed to get src_mbufs! message from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated continuously from 16:07:57.864279 through 16:07:58.121322; identical lines omitted ...]
00:31:37.708 [2024-07-12 16:07:58.121322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:37.708 [2024-07-12 16:07:58.122797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.708 [2024-07-12 16:07:58.124441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.708 [2024-07-12 16:07:58.125565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.708 [2024-07-12 16:07:58.125850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.708 [2024-07-12 16:07:58.127723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.708 [2024-07-12 16:07:58.128137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.708 [2024-07-12 16:07:58.129277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.708 [2024-07-12 16:07:58.130727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.708 [2024-07-12 16:07:58.131046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.132756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.133893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.135337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.136785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.137065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.139695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.141366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.143025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.144495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.144782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.146318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.709 [2024-07-12 16:07:58.147897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.149555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.149949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.150433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.970 [2024-07-12 16:07:58.152912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.153933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.155427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.156865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.157146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.158356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.158757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.159661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.161139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.161451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.163796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.165246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.166900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.167289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.167744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.169313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.170938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.172592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.174100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.174378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.176723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.177122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.177681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.179122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.970 [2024-07-12 16:07:58.179451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.181182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.182117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.183566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.185019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.970 [2024-07-12 16:07:58.185297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.187334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.188804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.190228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.191862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.192294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.193804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.195258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.196898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.197430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.197788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.200162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.201824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.203143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.204573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.204920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.206612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.207010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.207398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.971 [2024-07-12 16:07:58.208992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.209270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.211746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.213271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.214911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.216134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.216572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.217154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.218597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.220048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.221691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.222052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.224587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.225406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.225800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.226995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.227317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.228829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.230465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.231539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.232991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.233369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.234685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.236364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.971 [2024-07-12 16:07:58.237965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.239598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.239917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.241593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.243270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.244931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.246395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.246850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.249747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.251396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.252785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.254278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.254604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.256236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.257894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.258285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.258671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.258954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.261462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.263127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.264792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.265181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.265654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.266109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.971 [2024-07-12 16:07:58.266524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.266922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.267984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.268301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.269976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.271453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.272905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.272952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.273241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.273701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.274102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.274494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.274893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.275279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.276657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.276718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.276773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.276823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.277206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.277698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.278100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.278510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.278901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.279320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.971 [2024-07-12 16:07:58.280291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.971 [2024-07-12 16:07:58.280346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.280413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.280461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.280954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.281466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.281865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.282256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.282642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.283029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.284198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.284252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.284324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.284372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.284786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.284911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.284963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.285014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.285065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.285449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.287184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.287246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.287299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.287349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.972 [2024-07-12 16:07:58.287706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.287819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.287868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.287922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.287971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.288333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.289357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.289410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.289462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.289512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.289887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.289990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.290039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.290091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.290140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.290587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.291802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.291855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.291907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.291957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.292360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.292471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.292524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.292576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.972 [2024-07-12 16:07:58.292629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.293037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.294089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.294143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.294208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.294275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.294755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.294922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.294973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.295028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.295082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.295466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.296892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.296946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.297001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.297062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.297463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.297586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.297645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.297698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.297778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.298184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.299494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.299559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.972 [2024-07-12 16:07:58.299608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.299656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.300119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.300266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.300321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.300372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.301210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.301492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.302901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.302957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.303009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.303060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.303340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.303445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.303496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.303547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.303597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.303883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.305153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.305223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.305273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.305330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.305704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.972 [2024-07-12 16:07:58.306535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.972 [2024-07-12 16:07:58.306590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.306648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.306701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.307056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.308081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.308136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.308187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.308237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.308606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.308718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.308772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.308828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.308879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.309164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.310186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.310245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.310297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.310349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.310707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.310817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.310868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.310919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.310969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.311253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.973 [2024-07-12 16:07:58.312293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.312346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.312404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.312456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.312747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.312852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.312903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.312956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.313008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.313287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.314363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.314417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.314479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.314527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.314818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.314923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.314974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.315035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.315087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.315367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.316412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.316464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.316521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.316572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.973 [2024-07-12 16:07:58.316865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.316971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.317022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.317072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.317124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.317402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.318519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.318576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.318637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.318686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.318978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.319085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.319136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.319188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.319245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.319524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.320625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.320678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.320744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.320797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.321083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.321189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.321239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:37.973 [2024-07-12 16:07:58.321290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:37.973 [2024-07-12 16:07:58.321345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:38.238 [... the src_mbufs error above repeats for each failed allocation through 2024-07-12 16:07:58.431465 ...]
00:31:38.238 [2024-07-12 16:07:58.431758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:38.240 [... the dst_mbufs error above repeats for each failed allocation through 2024-07-12 16:07:58.630112 ...]
00:31:38.240 [2024-07-12 16:07:58.632466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:38.506 [... the src_mbufs error above repeats for each failed allocation through 2024-07-12 16:07:58.728446 ...]
00:31:38.506 [2024-07-12 16:07:58.732689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.732752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.732808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.732859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.733289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.733341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.733392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.733443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.736672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.736734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.736787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.736837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.737276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.737327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.737380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.737436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.741956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.742012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.742064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.742116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.742624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.742676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.742737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.742788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.506 [2024-07-12 16:07:58.746349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.746403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.746454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.746504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.746916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.746980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.747031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.747081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.750320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.750375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.750431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.750483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.750931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.750983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.751034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.751085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.754543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.754597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.754653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.754704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.755125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.755176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.755231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.755281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.506 [2024-07-12 16:07:58.759668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.759728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.759780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.759830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.760301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.760352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.760403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.760802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.764995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.765050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.765100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.765151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.765566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.765616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.765667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.765727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.768905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.768961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.770586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.770635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.771148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.771199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.771250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.771304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.506 [2024-07-12 16:07:58.776009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.776065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.777501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.777549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.506 [2024-07-12 16:07:58.777946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.777997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.778048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.778099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.780151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.780206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.781869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.781918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.782510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.782560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.783441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.783491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.785782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.785838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.787263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.787312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.789149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.789203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.789593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.789641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.507 [2024-07-12 16:07:58.792541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.792596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.793467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.793515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.795405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.795459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.797099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.797148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.800216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.800271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.801892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.801943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.801999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.802422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.803941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.803995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.805430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.805479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.805499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.805784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.807120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.808565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.810037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.810086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.507 [2024-07-12 16:07:58.810364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.810484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.811614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.811664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.813168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.813485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.814440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.814545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.815527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.815578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.817011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.817326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.818227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.819679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.821203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.822860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.823337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.823454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.823995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.824152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.826346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.827928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.829561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.831225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.507 [2024-07-12 16:07:58.831505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.831618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.832027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.832931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.834387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.834700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.837088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.838541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.840187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.840735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.841131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.842656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.844239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.845895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.847403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.847682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.850228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.850627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.851209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.852650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.852974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.854109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.855776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.857440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.507 [2024-07-12 16:07:58.859103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.859577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.861113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.861513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.862799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.863055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.864588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.866237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.507 [2024-07-12 16:07:58.867089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.867138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.867476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.868370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.868775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.869337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.870540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.870824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.872006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.873667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.875198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.876844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.877328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.879816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.880371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.881727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.508 [2024-07-12 16:07:58.882974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.883478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.883609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.884006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.884394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.884789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.885215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.886713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.887112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.887505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.887899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.888224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.888677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.889077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.889469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.889867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.890280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.891772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.892173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.892565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.892959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.893425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.893885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.894280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.508 [2024-07-12 16:07:58.894674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.895069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.895534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.896949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.897353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.897753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.898143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.898621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.899086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.899481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.899884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.900272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.900636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.902171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.902571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.902972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.903359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.903838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.904292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.904688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.905088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.905477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.905915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.907598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:38.508 [2024-07-12 16:07:58.908009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.908404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.908800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.909228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.909678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.910079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.910473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.910868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.911279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.912756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.913158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.913554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.913951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.914348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.914807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.915211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.508 [2024-07-12 16:07:58.915602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:38.770 [2024-07-12 16:07:58.981935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.982930] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.982975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.984072] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.987580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.988329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.989743] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:38.770 [2024-07-12 16:07:58.991160] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.991470] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.993048] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.993092] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.993381] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.993423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.994901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.996041] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.996264] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.996278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.996290] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:58.999300] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.000616] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.002020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.003447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.005076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.005404] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.006949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.008336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.008558] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.008572] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.770 [2024-07-12 16:07:59.008584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.011341] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.012977] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:38.771 [2024-07-12 16:07:59.014609] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.016178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.017173] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.018107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.019212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.020643] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.020868] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.020882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.020894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.023988] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.025418] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.026848] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.028323] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.029040] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.029868] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.031281] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.032705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.032930] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.032944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.032956] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.035971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.037403] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.038840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:38.771 [2024-07-12 16:07:59.040179] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:38.771 [2024-07-12 16:07:59.042282] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... the same "Failed to get dst_mbufs!" error from accel_dpdk_cryptodev.c:476 repeats for every subsequent allocation attempt through 2024-07-12 16:07:59.133264 ...]
00:31:38.772 [2024-07-12 16:07:59.135297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
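The "Failed to get dst_mbufs!" / "Failed to get src_mbufs!" messages come from the accel_dpdk_cryptodev module's resource-allocation step, which presumably draws source and destination mbufs from a DPDK mempool in bursts and fails when the pool is exhausted. The standalone sketch below is an illustration of that failure mode only, not the module's actual code; the pool name "demo_pool", the pool size of 16 and the burst of 32 are made-up values chosen so the bulk allocation cannot succeed.

#include <stdio.h>
#include <rte_eal.h>
#include <rte_lcore.h>
#include <rte_mbuf.h>
#include <rte_mempool.h>

#define BURST 32

int main(int argc, char **argv)
{
    if (rte_eal_init(argc, argv) < 0)
        return 1;

    /* A deliberately tiny pool (16 mbufs) so that a 32-mbuf burst cannot be
     * satisfied, mimicking the exhausted-pool condition behind the
     * "Failed to get ... mbufs!" errors in the log. */
    struct rte_mempool *pool = rte_pktmbuf_pool_create("demo_pool", 16, 0, 0,
            RTE_MBUF_DEFAULT_BUF_SIZE, rte_socket_id());
    if (pool == NULL)
        return 1;

    struct rte_mbuf *mbufs[BURST];

    /* rte_pktmbuf_alloc_bulk() either fills the whole array or fails:
     * on failure it allocates nothing and returns a non-zero value. */
    if (rte_pktmbuf_alloc_bulk(pool, mbufs, BURST) != 0)
        fprintf(stderr, "Failed to get mbufs!\n");
    else
        rte_pktmbuf_free_bulk(mbufs, BURST);

    rte_eal_cleanup();
    return 0;
}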
[... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeats for every subsequent allocation attempt through 2024-07-12 16:07:59.362715 ...]
00:31:39.039 [2024-07-12 16:07:59.363026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.363034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.363042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.363049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.369900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.369941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.371044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.371075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.371288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.371296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.371303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.371310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.378090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.378129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.379559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.379590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.379803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.379820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.379828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.379835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.384827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.384867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.385168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.385199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.039 [2024-07-12 16:07:59.385488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.385500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.385507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.385514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.393342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.394966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.395206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.395214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.395221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.398669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.400288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.401907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.403337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.403761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.405118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.406145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.407270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.407483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.407491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.407498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.410911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.412025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.413454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.414877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.039 [2024-07-12 16:07:59.415514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.417035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.418162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.419916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.420221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.420229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.420236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.423923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.425565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.427135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.428586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.430304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.431440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.432574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.434026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.434239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.434248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.434255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.437558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.438844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.440367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.440721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.441695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.442981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.039 [2024-07-12 16:07:59.443292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.443594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.444009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.444019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.444026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.448361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.449635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.450275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.450593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.451295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.452816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.454448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.454752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.454966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.454977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.454985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.457245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.458405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.459692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.460721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.462270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.463520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.464653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.465759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.039 [2024-07-12 16:07:59.466146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.466154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.466161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.469371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.039 [2024-07-12 16:07:59.470721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.471417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.472847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.473478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.473788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.474805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.476079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.476292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.476300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.476307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.478293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.478606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.478962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.040 [2024-07-12 16:07:59.480484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.303 [2024-07-12 16:07:59.481538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.303 [2024-07-12 16:07:59.482380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.303 [2024-07-12 16:07:59.483419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.303 [2024-07-12 16:07:59.484851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.303 [2024-07-12 16:07:59.485211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.303 [2024-07-12 16:07:59.485220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.303 [2024-07-12 16:07:59.485227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.303 [2024-07-12 16:07:59.487761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.303 [2024-07-12 16:07:59.488830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.488905] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.488968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.489028] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.489445] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.489504] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.489563] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.489618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.491002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.491065] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.491125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.491195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.491774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.303 [2024-07-12 16:07:59.491835] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.304 [2024-07-12 16:07:59.491900] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.304 [2024-07-12 16:07:59.491970] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.304 [2024-07-12 16:07:59.493209] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.304 [2024-07-12 16:07:59.493271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.304 [2024-07-12 16:07:59.493331] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.304 [2024-07-12 16:07:59.493393] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.304 [2024-07-12 16:07:59.493789] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.304 [2024-07-12 16:07:59.495287] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:39.306 [2024-07-12 16:07:59.605400] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.306 [2024-07-12 16:07:59.605506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.605528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.609306] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.610992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.612658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.613693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.614265] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.615938] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.616369] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.616394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.618045] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.618402] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.618433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.618458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.623341] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.623805] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.625155] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.626471] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.628165] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.628903] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.630000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.631416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.631871] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.631894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:39.306 [2024-07-12 16:07:59.631918] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.636032] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.637565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.638582] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.639334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.641232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.642901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.644009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.644658] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.644954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.306 [2024-07-12 16:07:59.644978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.645013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.647740] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.648867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.649272] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.649667] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.651336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.651780] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.651913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.652797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.653198] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.653594] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.653904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.653941] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:39.307 [2024-07-12 16:07:59.653965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.658152] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.659199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.659901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.660297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.660690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.662217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.662373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.662397] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.662423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.666421] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.668103] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.668499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.668901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.669347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.669409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.670258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.671893] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.672287] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.672700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.672733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.672758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.674786] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.675194] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:39.307 [2024-07-12 16:07:59.675597] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.675997] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.676446] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.676915] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.677332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.677736] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.678134] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.678556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.678581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.678606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.680998] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.681415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.681818] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.682211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.682666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.683148] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.683550] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.683955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.684350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.684761] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.684784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.684820] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.686849] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.687261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:39.307 [2024-07-12 16:07:59.687666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.688064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.688573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.689046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.689447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.689851] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.690248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.690706] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.690738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.690762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.693193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.693610] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.694026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.694689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.695005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.696187] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.696590] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.696992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.698488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.698807] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.307 [2024-07-12 16:07:59.698829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.698852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.701659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.703023] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:39.308 [2024-07-12 16:07:59.703427] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.703826] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.704157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.705795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.706200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.707333] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.707979] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.708265] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.708287] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.708317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.711962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.712810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.714476] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.714875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.715160] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.715858] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.716916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.717318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.717901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.718187] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.718208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.718232] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.721914] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.723479] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:39.308 [2024-07-12 16:07:59.724438] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.725809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.726216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.726352] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.726411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.726816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.727099] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.727121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.727145] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.731024] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.732702] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.734384] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.735356] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.735684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.735827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.736297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.736692] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.738204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.738518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.738540] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.738563] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.742966] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.744613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:39.308 [2024-07-12 16:07:59.745755] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:39.570 [2024-07-12 16:07:59.747055] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.570 [2024-07-12 16:07:59.747421] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.570 [2024-07-12 16:07:59.747885] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.570 [2024-07-12 16:07:59.749511] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.570 [2024-07-12 16:07:59.751188] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.570 [2024-07-12 16:07:59.752837] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.570 [2024-07-12 16:07:59.753177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.570 [2024-07-12 16:07:59.753201] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.570 [2024-07-12 16:07:59.753225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:31:39.570 [2024-07-12 16:07:59.766065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.767717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.769059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.769502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.769517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.773301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.774754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.776210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.777876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.778681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.780013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.781441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.782866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.783144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.570 [2024-07-12 16:07:59.783158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.572 [2024-07-12 16:08:00.000515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.000569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.000624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.001027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.001079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.001130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.001182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.001510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.001538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.001551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.004718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.004773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.004811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.004848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.005344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.005384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.005424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.005462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.005728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.005739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.005748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.008602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.008649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.008686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.572 [2024-07-12 16:08:00.008734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.009144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.009182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.009219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.009256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.009516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.009526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.009535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.011434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.011482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.011523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.011560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.012079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.012124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.012162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.012199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.012577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.012587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.012596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.014392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.572 [2024-07-12 16:08:00.014440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.573 [2024-07-12 16:08:00.014476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.573 [2024-07-12 16:08:00.014512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.573 [2024-07-12 16:08:00.014945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.573 [2024-07-12 16:08:00.014986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.573 [2024-07-12 16:08:00.015023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.573 [2024-07-12 16:08:00.015059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.015525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.015537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.015546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.017508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.017556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.017593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.017629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.018178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.018219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.018256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.018292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.018600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.018611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.018625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.020507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.020556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.020598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.838 [2024-07-12 16:08:00.020643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.021157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.021535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.021929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.839 [2024-07-12 16:08:00.021970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.022313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.022324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.022333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.025526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.025575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.025617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.025655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.026477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.026520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.026557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.026592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.027002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.027014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.027025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.029462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.029531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.029580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.029630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.029649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.030125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.030311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.030363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.030414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.839 [2024-07-12 16:08:00.030472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.030945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.030966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.030981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.032962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.033014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.033053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.033092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.033464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.033617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.033655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.033693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.033738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.034163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.034175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.034185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.036051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.036099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.036137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.036176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.036534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.036634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.036672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.036728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.839 [2024-07-12 16:08:00.036765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.037180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.037194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.037203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.039078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.039125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.039167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.039205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.039543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.039647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.039685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.039728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.039765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.040174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.040184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.040194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.043200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.043249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.043285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.043332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.043795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.043893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.043940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.043978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.839 [2024-07-12 16:08:00.044044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.044560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.044572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.044581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.047668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.047723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.047761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.047797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.048087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.048180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.048218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.048260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.048300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.048559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.048569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.048578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.051294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.051340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.051377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.051417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.051832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.051928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.051965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.839 [2024-07-12 16:08:00.052007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.839 [2024-07-12 16:08:00.052044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.052443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.052454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.052462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.055903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.055948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.055984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.056021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.056439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.056537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.056574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.056615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.058223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.058486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.058496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.058505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.060630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.060700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.060741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.060789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.061166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.061260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.061297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.061333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.840 [2024-07-12 16:08:00.061369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.061690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.061701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.061715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.063735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.063785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.063825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.063862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.064121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.064550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.064590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.064634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.064673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.065108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.065122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.065132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.068540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.068586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.068623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.068663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.069098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.069191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.069228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.069276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.840 [2024-07-12 16:08:00.069313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.069726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.069736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.069745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.073210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.073256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.073630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.073669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.074074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.074198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.074236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.074272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.074309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.074723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.074734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.074743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.078902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.078947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.079896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.079944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.080206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.081387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.081431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.081810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.840 [2024-07-12 16:08:00.081848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.082243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.082253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.082262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.086644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.086690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.087067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.087105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.087501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.089045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.089086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.090086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.090124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.090432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.090442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.090450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.094052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.094099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.095753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.095791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.096055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.097236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.097277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.097649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.840 [2024-07-12 16:08:00.097686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.098089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.098099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.098108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.840 [2024-07-12 16:08:00.103755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.103801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.105422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.105460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.105727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.106187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.106228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.106264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.106315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.106758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.106773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.106783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.112215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.112275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.112312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.112348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.112609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.114300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.114341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.114717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.841 [2024-07-12 16:08:00.114757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.115220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.115231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.115240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.120788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.122311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.122452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.124151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.124192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.124564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.124602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.125027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.125038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.125047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.129003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.130660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.132287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.133914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.134179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.134638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.134690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.135334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.135373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:39.841 [2024-07-12 16:08:00.135673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:39.841 [2024-07-12 16:08:00.135684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:39.841 (same accel_dpdk_cryptodev_task_alloc_resources *ERROR*: Failed to get src_mbufs! message repeated continuously between 16:08:00.135684 and 16:08:00.525970)
00:31:40.154 [2024-07-12 16:08:00.525970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:40.154 [2024-07-12 16:08:00.526066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.526132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.526184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.526236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.526682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.526696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.526719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.527690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.527751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.527804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.527855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.528166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.528273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.528326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.528377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.528429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.528719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.528734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.528748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.529639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.529692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.529755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.529805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.530088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.154 [2024-07-12 16:08:00.530195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.530584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.530979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.531030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.154 [2024-07-12 16:08:00.531356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.531370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.531384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.532353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.532406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.532458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.532508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.532793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.534389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.534442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.534493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.534541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.534830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.534844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.534858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.536097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.536152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.536203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.536253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.536585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.155 [2024-07-12 16:08:00.536693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.536740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.536778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.536818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.537079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.537090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.537099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.539445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.539492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.539528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.539564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.539829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.539945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.539983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.540020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.540057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.540478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.540488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.540497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.541525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.541569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.541606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.541641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.541996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.155 [2024-07-12 16:08:00.542090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.542128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.542169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.542205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.542464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.542474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.542483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.543431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.543476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.543513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.543548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.543813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.543907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.543944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.543981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.544017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.544346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.544356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.544365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.545317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.545363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.545400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.545436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.545777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.155 [2024-07-12 16:08:00.545869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.545907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.545943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.545983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.546242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.546253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.546262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.547301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.547348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.547385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.547421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.547682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.547778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.547822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.547858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.547894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.548220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.548231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.548240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.549171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.549216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.549255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.549291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.549561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.155 [2024-07-12 16:08:00.549662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.549700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.549744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.549780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.550040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.550051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.155 [2024-07-12 16:08:00.550059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.551066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.551109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.551145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.551181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.551527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.551621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.551659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.551696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.552080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.552582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.552594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.552605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.556726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.556772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.556809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.556845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.557105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.156 [2024-07-12 16:08:00.557200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.557236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.557272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.557308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.557651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.557661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.557670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.560548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.560594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.560630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.560666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.560934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.561959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.562000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.562039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.562079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.562380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.562392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.562401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.563419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.563462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.563504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.563541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.563838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.156 [2024-07-12 16:08:00.563932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.563971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.564007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.564043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.564366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.564376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.564385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.565208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.566852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.566892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.568041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.568507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.568630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.568668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.568704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.568747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.569007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.569018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.569026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.570070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.571352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.571392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.573047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.573358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.156 [2024-07-12 16:08:00.573834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.573876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.574882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.574920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.575329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.575339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.575348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.576296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.577733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.577773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.578530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.578978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.579596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.579637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.581100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.581138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.581458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.581469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.581478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.156 [2024-07-12 16:08:00.582448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.583599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.583640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.585295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.585624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.421 [2024-07-12 16:08:00.586072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.586113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.587036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.587074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.587371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.587382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.587390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.588368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.589527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.589566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.591216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.591621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.592085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.592127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.592164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.592200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.592510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.592522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.592532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.593482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.595137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.595177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.595213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.595639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.421 [2024-07-12 16:08:00.596091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.596132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.597468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.597516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.597783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.597793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.597802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.598691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.600147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.600522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.600734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.601591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.601633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.602986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.603025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.603365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.603375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.603388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.604218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.604596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.604991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.606646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.606917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.421 [2024-07-12 16:08:00.607707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.422 [2024-07-12 16:08:00.607754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.608610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.608662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.609136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.609147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.609156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.611193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.612357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.614017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.614773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.615158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.616199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.617631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.617670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.619018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.619343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.619353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.619362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.621466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.621856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.622458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.623646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.623915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.624409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.422 [2024-07-12 16:08:00.625749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.626126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.626670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.626982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.626994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.627003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.628375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.628770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.629157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.629531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.629888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.630347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.630728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.631104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.631489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.631891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.631903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.631911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.633466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.633867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.634249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.634624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.634968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.422 [2024-07-12 16:08:00.635093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.422 [2024-07-12 16:08:00.635468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:40.692 [... identical accel_dpdk_cryptodev_task_alloc_resources "Failed to get src_mbufs!" errors, timestamped 16:08:00.635 through 16:08:00.937, omitted ...]
00:31:40.692 [2024-07-12 16:08:00.937979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.938030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.938082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.938364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.938470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.938520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.938573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.938628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.938974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.938988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.939003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.941378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.941439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.941491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.941542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.941959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.942101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.942151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.942203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.942255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.942726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.942742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.942765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.943645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.692 [2024-07-12 16:08:00.943698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.943756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.943808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.944090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.944197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.944248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.944308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.944357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.944748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.944762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.944776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.946044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.946111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.946163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.946214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.946560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.946665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.946721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.946775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.946828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.947252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.947266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.947282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.948460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.692 [2024-07-12 16:08:00.948514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.948568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.948619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.949074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.949259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.949310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.949363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.949416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.949814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.692 [2024-07-12 16:08:00.949829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.949844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.950832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.950884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.950935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.950986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.951518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.951687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.951744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.951799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.951853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.952273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.952287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.952301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.953293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.693 [2024-07-12 16:08:00.953347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.953398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.953449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.953946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.954118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.954184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.954238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.954293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.954716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.954730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.954743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.955744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.955797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.955867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.955914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.956283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.956390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.956440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.956492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.956889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.957399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.957412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.957426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.958325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.693 [2024-07-12 16:08:00.958385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.958437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.958487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.958860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.958963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.959014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.959065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.959116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.959398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.959411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.959426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.960432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.960485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.960536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.960588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.960960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.961912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.961966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.962029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.962084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.962369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.962383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.962396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.963348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.693 [2024-07-12 16:08:00.963403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.963455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.963512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.963881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.963978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.964028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.964080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.964129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.964555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.964569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.964583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.965487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.965902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.965953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.967624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.968131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.968259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.968309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.968360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.968408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.968860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.968876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.968891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.970037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.693 [2024-07-12 16:08:00.971184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.971233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.972810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.973253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.973708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.973769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.974645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.974693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.975138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.975153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.975166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.976191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.976589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.693 [2024-07-12 16:08:00.976639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.977954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.978418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.980150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.980203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.980592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.980678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.981083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.981097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.981111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.982351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.694 [2024-07-12 16:08:00.983415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.983467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.983862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.984243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.985924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.985976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.987072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.987125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.987557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.987573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.987587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.988707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.989904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.989953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.991470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.991924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.992379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.992432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.992485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.992533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.992826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.992841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.992856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.993847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.694 [2024-07-12 16:08:00.994650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.994701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.994763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.995191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.996722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.996774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.998254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.998303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.998584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.998598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.998611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.999544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:00.999948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.001607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.001856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.002643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.002697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.003100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.003148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.003471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.003485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.003498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.004443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.004848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.694 [2024-07-12 16:08:01.005239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.006885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.007168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.008188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.008241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.008759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.008808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.009145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.009159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.009172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.011808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.012464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.012859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.013246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.013560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.014149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.014541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.014592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.016126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.016553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.016571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.694 [2024-07-12 16:08:01.016584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.017884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.019438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.695 [2024-07-12 16:08:01.021054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.022716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.023072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.024303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.025303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.025692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.026536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.026915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.026929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.026943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.028169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.028567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.030223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.031724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.032106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.033315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.034521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.034929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.035817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.036229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.036243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.036257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.038565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.040234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.695 [2024-07-12 16:08:01.040628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.041031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.041314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.041433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.043104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.044089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.045772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.046154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.046168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.046181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.048201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.049755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.050145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.050532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.050844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.051391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.051487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.051501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.051516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.052834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.053232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.054893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.056281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.056691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.695 [2024-07-12 16:08:01.056824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.057215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.057601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.059263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.059652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.059666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.059679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.061271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.061673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.063184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.064726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.065006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.066520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.068065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.069732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.070119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.070614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.070629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.070644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.072942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.074281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.075713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.077193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.695 [2024-07-12 16:08:01.077470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.695 [2024-07-12 16:08:01.077927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.695 ... [same *ERROR* line from accel_dpdk_cryptodev.c:468 repeated continuously; several hundred occurrences between 16:08:01.077927 and 16:08:01.353980 omitted] ... 
00:31:40.967 [2024-07-12 16:08:01.353980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.967 [2024-07-12 16:08:01.354260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.967 [2024-07-12 16:08:01.354277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.967 [2024-07-12 16:08:01.354291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.967 [2024-07-12 16:08:01.355282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.967 [2024-07-12 16:08:01.355334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.967 [2024-07-12 16:08:01.355385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.967 [2024-07-12 16:08:01.355453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.355927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.356121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.356173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.356226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.356281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.356560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.356573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.356588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.357649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.357702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.357759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.357812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.358099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.358202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.358281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.358330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.358380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.968 [2024-07-12 16:08:01.358702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.358725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.358740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.359727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.359783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.359833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.359886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.360272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.360374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.360423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.360474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.360524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.360837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.360851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.360866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.362036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.362091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.362141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.362192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.362555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.362657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.362707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.362765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.363345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.968 [2024-07-12 16:08:01.363628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.363643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.363661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.364784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.364837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.364888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.364939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.365362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.365466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.365515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.365585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.365633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.365921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.365935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.365954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.367097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.367149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.367201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.367253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.367539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.367981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.368041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.368093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.368164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.968 [2024-07-12 16:08:01.368598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.368613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.968 [2024-07-12 16:08:01.368628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.369634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.369687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.369744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.369795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.370187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.370282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.370332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.370384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.370434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.370723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.370737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.370750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.372158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.372705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.372760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.373740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.374020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.374159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.374209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.374260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.374311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.969 [2024-07-12 16:08:01.374647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.374660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.374674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.375694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.376277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.376327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.377788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.378069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.379590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.379643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.381082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.381130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.381409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.381423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.381436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.382941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.384432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.384482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.386128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.386519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.388017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.388070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.389505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.389554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.969 [2024-07-12 16:08:01.389839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.389853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.389867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.391136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.392611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.392661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.394297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.394574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.395548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.395602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.397061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.397109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.397432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.397446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.397460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.398593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.400158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.400208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.401655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.401943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.402930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.402984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.403038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.403090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:40.969 [2024-07-12 16:08:01.403371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.403385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:40.969 [2024-07-12 16:08:01.403399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.404682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.406154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.406204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.406255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.406535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.407915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.407968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.409526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.409573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.409910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.409924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.409938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.411058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.412269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.413726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.413965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.415689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.415746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.416900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.416948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.417332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.233 [2024-07-12 16:08:01.417345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.417359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.418403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.418983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.420429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.421873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.422154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.423152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.423206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.424355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.424402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.424737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.424751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.424765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.427135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.428597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.430237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.431276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.431597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.433119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.433720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.433770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.434208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.434516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.233 [2024-07-12 16:08:01.434530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.434544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.436162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.437602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.439246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.440243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.440763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.441622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.443065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.444524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.446173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.446523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.446537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.446550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.449044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.449443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.449889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.233 [2024-07-12 16:08:01.451312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.451636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.453374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.454349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.455807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.457221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.457504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.234 [2024-07-12 16:08:01.457518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.457532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.459812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.461253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.462732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.464401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.464776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.464894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.466392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.467945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.469614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.470101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.470115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.470129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.472872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.474521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.475529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.477010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.477291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.478994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.479091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.479105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.479119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.482073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.234 [2024-07-12 16:08:01.483563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.485204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.486379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.486659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.486792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.488228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.489879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.490957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.491418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.491432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.491446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.493773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.495413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.496240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.497588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.497964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.498449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.499781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.501210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.502705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.502988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.503002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.503015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.504576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.234 [2024-07-12 16:08:01.504980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.506655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.508326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.508607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.509636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.511091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.512748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.513164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.513605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.513619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.513632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.515400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.516843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.518288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.518931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.519355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.520656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.521810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.522653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.524101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.524447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.524461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.524475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.525805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.234 [2024-07-12 16:08:01.527477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.529140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.530802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.531083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.532074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.532881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.533269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.534595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.534918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.534933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.534947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.537266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.538721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.539111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.539498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.539781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.541425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.542417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.543862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.545251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.234 [2024-07-12 16:08:01.545624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.545639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.545652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.548085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.235 [2024-07-12 16:08:01.548628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.549992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.551259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.551726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.552465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.553928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.555282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.556287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.556579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.556593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.556606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.558056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.559722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.561395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.562339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.562650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.562770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.562821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.563229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.563536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.563551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.563569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.565024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.566596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.235 [2024-07-12 16:08:01.566990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.567394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.567703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.567767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.569433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.570665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.572104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.572407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.572421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.572435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.574713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.574826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.576215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.577767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.579378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.579769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.580193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.580207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.580221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.581068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.582745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.583754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.585200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.585479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.235 [2024-07-12 16:08:01.585954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.586348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.587906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.589517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.589802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.589816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.589829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.592322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.593602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.594008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.594579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.594965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.596221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.597893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.598280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.598668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.598952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.598966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.598980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.600194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.600594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.601030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.602386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.602664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.235 [2024-07-12 16:08:01.603139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.603533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.604020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.605327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.605738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.605752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.605765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.607224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.608642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.610317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.610705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.611176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.611691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.612995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.614525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.614916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.615391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.615405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.615418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.616812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.617224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.617975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.618992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.235 [2024-07-12 16:08:01.619454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.235 [2024-07-12 16:08:01.619915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.620415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.621672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.623156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.623591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.623605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.623618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.626596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.627000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.627390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.628051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.628366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.628827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.629220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.629611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.630005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.630438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.630453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.630467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.632046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.632447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.632860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.633246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.633594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.236 [2024-07-12 16:08:01.634053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.634445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.634841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.634890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.635363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.635377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.635391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.636996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.638357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.639514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.640145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.640576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.641039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.642640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.643992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.644419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.644852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.644865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.644880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.646143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.646542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.646945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.648612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.648896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.236 [2024-07-12 16:08:01.649018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.650137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.650756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.651154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.651476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.651490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.651503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.654182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.654584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.654979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.655380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.655661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.656127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.656521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.658173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.658569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.658858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.658872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.658886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.661783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.661840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.663302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.663350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.663697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.236 [2024-07-12 16:08:01.665060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.666592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.666987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.667388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.667669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.667683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.667697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.669270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.669326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.669725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.669789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.670081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.670218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.671637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.671688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.673375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.673696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.673714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.673728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.676572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.236 [2024-07-12 16:08:01.676627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.678282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.678332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.678615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.499 [2024-07-12 16:08:01.678748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.680100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.680150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.681713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.682133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.682147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.682161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.685013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.685069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.685824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.685874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.686219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.686342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.686740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.686789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.687177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.687466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.687480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.687494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.688899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.688956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.690363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.690411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.690797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.499 [2024-07-12 16:08:01.690919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.692364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.693827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.694526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.499 [2024-07-12 16:08:01.694811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.694825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.694839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.696268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.696324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.696722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.697989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.698373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.698497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.698893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.698943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.699526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.699880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.699893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.699907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.702517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.702574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.702626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.703020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.703447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.500 [2024-07-12 16:08:01.703572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.704637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.704687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.705641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.705948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.705962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.705976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.706953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.707006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.707058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.707108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.707544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.707651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.708053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.708103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.709237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.709519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.709533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.709546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.710461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.710513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.710565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.710616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.711064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.500 [2024-07-12 16:08:01.711248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.711297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.712181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.712230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.712590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.712607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.712621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.713525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.713579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.713630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.713681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.713966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.714073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.714123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.714177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.714225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.714728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.714744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.714760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.715790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.715843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.715894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.715958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.716237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.500 [2024-07-12 16:08:01.716341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.716390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.716440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.716489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.716810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.716824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.716837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.718013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.718067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.718122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.718173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.718516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.720213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.720266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.720319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.720370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.720717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.720731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.720745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.721678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.721735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.721787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.721841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.722265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.500 [2024-07-12 16:08:01.722371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.722801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.722850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.722901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.723236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.723250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.723264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.724257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.724310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.724361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.724412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.724745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.724852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.724902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.724953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.725004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.725355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.725369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.725383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.726317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.726371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.726422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.726472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.726788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.500 [2024-07-12 16:08:01.726891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.500 [2024-07-12 16:08:01.726940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.726991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.727041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.727332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.727345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.727359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.728271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.728324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.728379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.728430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.728789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.728892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.728942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.728994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.729064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.729573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.729588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.729603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.730669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.730728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.730782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.730832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.731169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.501 [2024-07-12 16:08:01.731282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.731341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.731391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.731441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.731813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.731827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.731841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.732837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.732890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.732941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.732992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.733344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.733441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.733490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.733541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.733591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.733882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.733895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.733909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.734891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.734942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.734994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.735045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.735440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.501 [2024-07-12 16:08:01.735544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.735593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.735644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.735702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.736129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.736143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.736157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.737455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.737513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.737568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.737619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.737962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.738068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.739654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.741061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.741110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.741394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.741408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.741422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.742599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.742652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.742704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.742762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:41.501 [2024-07-12 16:08:01.743044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:41.501 [2024-07-12 16:08:01.744613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same accel_dpdk_cryptodev.c:468 "Failed to get src_mbufs!" error repeats 87 more times (88 occurrences in total) between 16:08:01.744613 and 16:08:01.761411 during the big-IO verify run; the duplicate lines are elided ...]
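The MiB/s column in the results table that follows can be sanity-checked by hand from the IOPS column, since every verify I/O in this run is 65536 bytes (the IO size shown in the Job lines). The bc commands below are illustrative only and are not part of the original run:

$ echo "scale=2; 48.73 * 65536 / 1048576" | bc    # crypto_ram, core mask 0x1: 3.04 (truncated); the table rounds the same 3.045... to 3.05
$ echo "scale=2; 361.46 * 65536 / 1048576" | bc   # crypto_ram3, core mask 0x1: 22.59, exactly as reported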
00:31:41.762
00:31:41.762 Latency(us)
00:31:41.762 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:41.762 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:41.762 Verification LBA range: start 0x0 length 0x100
00:31:41.762 crypto_ram : 5.56 48.73 3.05 0.00 0.00 2531265.22 19358.33 2077793.67
00:31:41.763 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:41.763 Verification LBA range: start 0x100 length 0x100
00:31:41.763 crypto_ram : 5.75 44.54 2.78 0.00 0.00 2796909.49 129055.51 2606921.26
00:31:41.763 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:41.763 Verification LBA range: start 0x0 length 0x100
00:31:41.763 crypto_ram2 : 5.56 48.73 3.05 0.00 0.00 2452520.82 19660.80 2077793.67
00:31:41.763 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:41.763 Verification LBA range: start 0x100 length 0x100
00:31:41.763 crypto_ram2 : 5.75 44.53 2.78 0.00 0.00 2696664.62 128248.91 2606921.26
00:31:41.763 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:41.763 Verification LBA range: start 0x0 length 0x100
00:31:41.763 crypto_ram3 : 5.48 361.46 22.59 0.00 0.00 321751.72 23592.96 445241.50
00:31:41.763 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:41.763 Verification LBA range: start 0x100 length 0x100
00:31:41.763 crypto_ram3 : 5.56 280.90 17.56 0.00 0.00 407394.68 13913.80 571070.62
00:31:41.763 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:41.763 Verification LBA range: start 0x0 length 0x100
00:31:41.763 crypto_ram4 : 5.55 378.20 23.64 0.00 0.00 300516.61 2873.50 353289.45
00:31:41.763 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:41.763 Verification LBA range: start 0x100 length 0x100
00:31:41.763 crypto_ram4 : 5.66 303.61 18.98 0.00 0.00 368612.07 4940.41 487184.54
00:31:41.763 ===================================================================================================================
00:31:41.763 Total : 1510.70 94.42 0.00 0.00 629269.78 2873.50 2606921.26
00:31:42.024
00:31:42.024 real 0m8.644s
00:31:42.024 user 0m16.599s
00:31:42.024 sys 0m0.321s
00:31:42.024 16:08:02 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:31:42.024 16:08:02 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:31:42.024 ************************************
00:31:42.024 END TEST bdev_verify_big_io
00:31:42.024 ************************************
00:31:42.024 16:08:02 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:31:42.024 16:08:02 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:42.024 16:08:02 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:31:42.024 16:08:02 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:31:42.024 16:08:02 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:42.285 ************************************
00:31:42.285 START TEST bdev_write_zeroes
00:31:42.285 ************************************
00:31:42.285 16:08:02 blockdev_crypto_aesni.bdev_write_zeroes --
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:42.285 [2024-07-12 16:08:02.602816] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:31:42.285 [2024-07-12 16:08:02.602942] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2718330 ] 00:31:42.553 [2024-07-12 16:08:02.743750] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:42.553 [2024-07-12 16:08:02.820505] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:42.553 [2024-07-12 16:08:02.841517] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:42.553 [2024-07-12 16:08:02.849544] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:42.553 [2024-07-12 16:08:02.857567] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:42.553 [2024-07-12 16:08:02.941388] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:45.097 [2024-07-12 16:08:05.105443] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:45.097 [2024-07-12 16:08:05.105491] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:45.097 [2024-07-12 16:08:05.105499] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:45.097 [2024-07-12 16:08:05.113462] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:45.097 [2024-07-12 16:08:05.113473] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:45.097 [2024-07-12 16:08:05.113479] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:45.097 [2024-07-12 16:08:05.121482] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:45.097 [2024-07-12 16:08:05.121493] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:45.097 [2024-07-12 16:08:05.121498] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:45.097 [2024-07-12 16:08:05.129502] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:45.097 [2024-07-12 16:08:05.129512] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:45.097 [2024-07-12 16:08:05.129517] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:45.097 Running I/O for 1 seconds... 
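The functional bdevperf runs in this suite all point at the same bdev.json, which the log never prints. Purely as a rough sketch — method and parameter names follow current SPDK JSON-config conventions, the key value is a dummy placeholder, the Malloc size is arbitrary, and nothing here is copied from the actual test file — a minimal configuration for one of the Malloc-backed crypto_aesni_mb vbdevs referenced in the notices above could look like this:

# all names, sizes and the key below are assumptions inferred from the log notices, not the real bdev.json
$ cat > /tmp/crypto_bdev_sketch.json <<'EOF'
{
  "subsystems": [
    { "subsystem": "accel", "config": [
        { "method": "accel_crypto_key_create",
          "params": { "name": "test_dek_aesni_cbc_1", "cipher": "AES_CBC",
                      "key": "00112233445566778899aabbccddeeff" } }
    ] },
    { "subsystem": "bdev", "config": [
        { "method": "bdev_malloc_create",
          "params": { "name": "Malloc0", "num_blocks": 65536, "block_size": 512 } },
        { "method": "bdev_crypto_create",
          "params": { "base_bdev_name": "Malloc0", "name": "crypto_ram",
                      "key_name": "test_dek_aesni_cbc_1" } }
    ] }
  ]
}
EOF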
00:31:46.038
00:31:46.038 Latency(us)
00:31:46.038 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:46.038 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:46.038 crypto_ram : 1.02 2348.49 9.17 0.00 0.00 54145.04 4637.93 64124.46
00:31:46.038 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:46.038 crypto_ram2 : 1.02 2361.80 9.23 0.00 0.00 53633.98 4637.93 59688.17
00:31:46.038 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:46.038 crypto_ram3 : 1.02 18177.51 71.01 0.00 0.00 6955.21 2117.32 8872.57
00:31:46.038 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:46.038 crypto_ram4 : 1.02 18214.44 71.15 0.00 0.00 6923.07 2142.52 7208.96
00:31:46.038 ===================================================================================================================
00:31:46.038 Total : 41102.24 160.56 0.00 0.00 12339.02 2117.32 64124.46
00:31:46.038
00:31:46.038 real 0m3.961s
00:31:46.038 user 0m3.633s
00:31:46.038 sys 0m0.292s
00:31:46.038 16:08:06 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:31:46.038 16:08:06 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:31:46.038 ************************************
00:31:46.038 END TEST bdev_write_zeroes
00:31:46.038 ************************************
00:31:46.299 16:08:06 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:31:46.299 16:08:06 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:46.299 16:08:06 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:31:46.299 16:08:06 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
00:31:46.299 16:08:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:46.299 ************************************
00:31:46.299 START TEST bdev_json_nonenclosed
00:31:46.299 ************************************
00:31:46.299 16:08:06 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:46.299 [2024-07-12 16:08:06.607961] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization...
00:31:46.299 [2024-07-12 16:08:06.608016] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2719046 ]
00:31:46.299 [2024-07-12 16:08:06.697333] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:46.559 [2024-07-12 16:08:06.774088] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:46.559 [2024-07-12 16:08:06.774141] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:31:46.559 [2024-07-12 16:08:06.774153] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:46.559 [2024-07-12 16:08:06.774159] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:46.559 00:31:46.559 real 0m0.291s 00:31:46.559 user 0m0.184s 00:31:46.559 sys 0m0.105s 00:31:46.559 16:08:06 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:31:46.559 16:08:06 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:46.559 16:08:06 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:31:46.559 ************************************ 00:31:46.559 END TEST bdev_json_nonenclosed 00:31:46.559 ************************************ 00:31:46.559 16:08:06 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:31:46.559 16:08:06 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:31:46.559 16:08:06 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:46.559 16:08:06 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:31:46.559 16:08:06 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:46.559 16:08:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:46.559 ************************************ 00:31:46.559 START TEST bdev_json_nonarray 00:31:46.559 ************************************ 00:31:46.559 16:08:06 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:46.559 [2024-07-12 16:08:06.951038] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:31:46.559 [2024-07-12 16:08:06.951095] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2719085 ] 00:31:46.820 [2024-07-12 16:08:07.045182] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:46.820 [2024-07-12 16:08:07.121664] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:46.820 [2024-07-12 16:08:07.121731] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
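The two JSON negative tests here feed bdevperf deliberately malformed configs; the real nonenclosed.json and nonarray.json are not reproduced in the log. Purely as an illustration of the two failure modes named in the error messages above — these files are guesses at the shape, not the actual test fixtures — configs like the following would be expected to trip the same checks:

# top-level value is an array, so the config is "not enclosed in {}" (sketch, not the real nonenclosed.json)
$ cat > /tmp/nonenclosed_sketch.json <<'EOF'
[ { "subsystem": "bdev", "config": [] } ]
EOF
# "subsystems" is an object instead of an array (sketch, not the real nonarray.json)
$ cat > /tmp/nonarray_sketch.json <<'EOF'
{ "subsystems": { "subsystem": "bdev", "config": [] } }
EOF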
00:31:46.820 [2024-07-12 16:08:07.121744] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:46.820 [2024-07-12 16:08:07.121751] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:46.820 00:31:46.820 real 0m0.271s 00:31:46.820 user 0m0.172s 00:31:46.820 sys 0m0.097s 00:31:46.820 16:08:07 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:31:46.820 16:08:07 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:46.820 16:08:07 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:31:46.820 ************************************ 00:31:46.820 END TEST bdev_json_nonarray 00:31:46.820 ************************************ 00:31:46.820 16:08:07 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:31:46.820 16:08:07 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:31:46.820 00:31:46.820 real 1m9.441s 00:31:46.820 user 2m45.951s 00:31:46.820 sys 0m6.726s 00:31:46.820 16:08:07 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:46.820 16:08:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:46.820 ************************************ 00:31:46.820 END TEST blockdev_crypto_aesni 00:31:46.820 ************************************ 00:31:47.081 16:08:07 -- common/autotest_common.sh@1142 -- # return 0 00:31:47.081 16:08:07 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:31:47.081 16:08:07 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:47.081 16:08:07 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:47.081 16:08:07 -- common/autotest_common.sh@10 -- # set +x 00:31:47.081 ************************************ 00:31:47.081 START TEST blockdev_crypto_sw 00:31:47.081 ************************************ 00:31:47.081 16:08:07 blockdev_crypto_sw -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:31:47.081 * Looking for test storage... 
00:31:47.081 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2719150 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 2719150 00:31:47.081 16:08:07 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:47.081 16:08:07 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 2719150 ']' 00:31:47.081 16:08:07 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:47.081 16:08:07 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:47.081 16:08:07 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:31:47.081 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:47.081 16:08:07 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:47.081 16:08:07 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:47.081 [2024-07-12 16:08:07.475742] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:31:47.081 [2024-07-12 16:08:07.475796] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2719150 ] 00:31:47.341 [2024-07-12 16:08:07.565551] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:47.341 [2024-07-12 16:08:07.632603] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:47.912 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:47.912 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:31:47.912 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:31:47.912 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:31:47.912 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:31:47.912 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:47.912 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:48.172 Malloc0 00:31:48.172 Malloc1 00:31:48.172 true 00:31:48.172 true 00:31:48.172 true 00:31:48.172 [2024-07-12 16:08:08.514079] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:48.172 crypto_ram 00:31:48.172 [2024-07-12 16:08:08.522104] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:48.172 crypto_ram2 00:31:48.172 [2024-07-12 16:08:08.530126] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:48.172 crypto_ram3 00:31:48.172 [ 00:31:48.172 { 00:31:48.172 "name": "Malloc1", 00:31:48.173 "aliases": [ 00:31:48.173 "ee1418a2-252a-4910-a672-4ee5bbdb55f4" 00:31:48.173 ], 00:31:48.173 "product_name": "Malloc disk", 00:31:48.173 "block_size": 4096, 00:31:48.173 "num_blocks": 4096, 00:31:48.173 "uuid": "ee1418a2-252a-4910-a672-4ee5bbdb55f4", 00:31:48.173 "assigned_rate_limits": { 00:31:48.173 "rw_ios_per_sec": 0, 00:31:48.173 "rw_mbytes_per_sec": 0, 00:31:48.173 "r_mbytes_per_sec": 0, 00:31:48.173 "w_mbytes_per_sec": 0 00:31:48.173 }, 00:31:48.173 "claimed": true, 00:31:48.173 "claim_type": "exclusive_write", 00:31:48.173 "zoned": false, 00:31:48.173 "supported_io_types": { 00:31:48.173 "read": true, 00:31:48.173 "write": true, 00:31:48.173 "unmap": true, 00:31:48.173 "flush": true, 00:31:48.173 "reset": true, 00:31:48.173 "nvme_admin": false, 00:31:48.173 "nvme_io": false, 00:31:48.173 "nvme_io_md": false, 00:31:48.173 "write_zeroes": true, 00:31:48.173 "zcopy": true, 00:31:48.173 "get_zone_info": false, 00:31:48.173 "zone_management": false, 00:31:48.173 "zone_append": false, 00:31:48.173 "compare": false, 00:31:48.173 "compare_and_write": false, 00:31:48.173 "abort": true, 00:31:48.173 "seek_hole": false, 00:31:48.173 "seek_data": false, 00:31:48.173 "copy": true, 00:31:48.173 "nvme_iov_md": false 00:31:48.173 }, 00:31:48.173 "memory_domains": [ 00:31:48.173 { 00:31:48.173 "dma_device_id": "system", 00:31:48.173 "dma_device_type": 1 00:31:48.173 }, 00:31:48.173 { 
00:31:48.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:48.173 "dma_device_type": 2 00:31:48.173 } 00:31:48.173 ], 00:31:48.173 "driver_specific": {} 00:31:48.173 } 00:31:48.173 ] 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:48.173 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:48.173 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:31:48.173 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:48.173 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:48.173 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:48.173 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "69faa954-aee2-5e71-8d22-22c271308f95"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "69faa954-aee2-5e71-8d22-22c271308f95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "91817087-4db1-5a5f-8bc6-d3a5d6c7d0b7"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "91817087-4db1-5a5f-8bc6-d3a5d6c7d0b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:31:48.433 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 2719150 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 2719150 ']' 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 2719150 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2719150 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2719150' 00:31:48.433 killing process with pid 2719150 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 2719150 00:31:48.433 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 2719150 00:31:48.695 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:48.695 16:08:08 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:48.695 
16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:31:48.695 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:48.695 16:08:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:48.695 ************************************ 00:31:48.695 START TEST bdev_hello_world 00:31:48.695 ************************************ 00:31:48.695 16:08:09 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:48.695 [2024-07-12 16:08:09.061122] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:31:48.695 [2024-07-12 16:08:09.061167] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2719479 ] 00:31:48.955 [2024-07-12 16:08:09.149667] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:48.955 [2024-07-12 16:08:09.224466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:48.955 [2024-07-12 16:08:09.361525] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:48.955 [2024-07-12 16:08:09.361568] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:48.955 [2024-07-12 16:08:09.361576] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:48.955 [2024-07-12 16:08:09.369543] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:48.955 [2024-07-12 16:08:09.369554] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:48.955 [2024-07-12 16:08:09.369560] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:48.955 [2024-07-12 16:08:09.377563] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:48.955 [2024-07-12 16:08:09.377572] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:48.955 [2024-07-12 16:08:09.377578] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:49.215 [2024-07-12 16:08:09.414233] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:31:49.215 [2024-07-12 16:08:09.414256] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:31:49.215 [2024-07-12 16:08:09.414266] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:31:49.215 [2024-07-12 16:08:09.415536] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:31:49.215 [2024-07-12 16:08:09.415582] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:31:49.215 [2024-07-12 16:08:09.415590] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:31:49.215 [2024-07-12 16:08:09.415614] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
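For orientation, the crypto_ram / crypto_ram2 / crypto_ram3 stack that this hello_world run (and the bounds and nbd tests further down) exercises could be assembled by hand with RPCs along the following lines. The block sizes and base/key pairings come from the bdev_get_bdevs dump and the "Found key" notices above, but the three test_dek_sw* keys are assumed to be registered with the accel layer already, and the exact rpc.py flag spellings are an assumption based on current SPDK conventions rather than commands taken from this run:

# sketch only - names and sizes from the log, flags assumed, keys assumed pre-registered; run from the spdk checkout
$ scripts/rpc.py bdev_malloc_create -b Malloc0 16 512        # 32768 x 512 B blocks, as in the dump above
$ scripts/rpc.py bdev_malloc_create -b Malloc1 16 4096       # 4096 x 4096 B blocks
$ scripts/rpc.py bdev_crypto_create -n test_dek_sw  Malloc0     crypto_ram
$ scripts/rpc.py bdev_crypto_create -n test_dek_sw2 Malloc1     crypto_ram2
$ scripts/rpc.py bdev_crypto_create -n test_dek_sw3 crypto_ram2 crypto_ram3   # crypto_ram3 is layered on crypto_ram2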
00:31:49.215 00:31:49.215 [2024-07-12 16:08:09.415623] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:31:49.215 00:31:49.215 real 0m0.528s 00:31:49.215 user 0m0.368s 00:31:49.215 sys 0m0.149s 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:31:49.215 ************************************ 00:31:49.215 END TEST bdev_hello_world 00:31:49.215 ************************************ 00:31:49.215 16:08:09 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:31:49.215 16:08:09 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:31:49.215 16:08:09 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:49.215 16:08:09 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:49.215 16:08:09 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:49.215 ************************************ 00:31:49.215 START TEST bdev_bounds 00:31:49.215 ************************************ 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2719515 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2719515' 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:49.215 Process bdevio pid: 2719515 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2719515 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2719515 ']' 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:49.215 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:49.215 16:08:09 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:49.475 [2024-07-12 16:08:09.667352] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:31:49.475 [2024-07-12 16:08:09.667406] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2719515 ] 00:31:49.475 [2024-07-12 16:08:09.757977] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:49.475 [2024-07-12 16:08:09.829152] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:49.475 [2024-07-12 16:08:09.829300] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:49.475 [2024-07-12 16:08:09.829302] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:49.735 [2024-07-12 16:08:09.971431] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:49.735 [2024-07-12 16:08:09.971478] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:49.735 [2024-07-12 16:08:09.971486] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:49.735 [2024-07-12 16:08:09.979451] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:49.735 [2024-07-12 16:08:09.979462] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:49.735 [2024-07-12 16:08:09.979468] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:49.735 [2024-07-12 16:08:09.987473] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:49.735 [2024-07-12 16:08:09.987484] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:49.735 [2024-07-12 16:08:09.987489] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:50.306 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:50.306 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:31:50.306 16:08:10 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:50.306 I/O targets: 00:31:50.306 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:31:50.306 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:31:50.306 00:31:50.306 00:31:50.306 CUnit - A unit testing framework for C - Version 2.1-3 00:31:50.306 http://cunit.sourceforge.net/ 00:31:50.306 00:31:50.306 00:31:50.306 Suite: bdevio tests on: crypto_ram3 00:31:50.306 Test: blockdev write read block ...passed 00:31:50.306 Test: blockdev write zeroes read block ...passed 00:31:50.306 Test: blockdev write zeroes read no split ...passed 00:31:50.306 Test: blockdev write zeroes read split ...passed 00:31:50.306 Test: blockdev write zeroes read split partial ...passed 00:31:50.306 Test: blockdev reset ...passed 00:31:50.306 Test: blockdev write read 8 blocks ...passed 00:31:50.306 Test: blockdev write read size > 128k ...passed 00:31:50.306 Test: blockdev write read invalid size ...passed 00:31:50.306 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:50.306 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:50.306 Test: blockdev write read max offset ...passed 00:31:50.306 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:50.306 Test: blockdev writev readv 8 blocks 
...passed 00:31:50.306 Test: blockdev writev readv 30 x 1block ...passed 00:31:50.306 Test: blockdev writev readv block ...passed 00:31:50.306 Test: blockdev writev readv size > 128k ...passed 00:31:50.306 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:50.306 Test: blockdev comparev and writev ...passed 00:31:50.306 Test: blockdev nvme passthru rw ...passed 00:31:50.306 Test: blockdev nvme passthru vendor specific ...passed 00:31:50.306 Test: blockdev nvme admin passthru ...passed 00:31:50.306 Test: blockdev copy ...passed 00:31:50.306 Suite: bdevio tests on: crypto_ram 00:31:50.306 Test: blockdev write read block ...passed 00:31:50.306 Test: blockdev write zeroes read block ...passed 00:31:50.306 Test: blockdev write zeroes read no split ...passed 00:31:50.306 Test: blockdev write zeroes read split ...passed 00:31:50.306 Test: blockdev write zeroes read split partial ...passed 00:31:50.306 Test: blockdev reset ...passed 00:31:50.306 Test: blockdev write read 8 blocks ...passed 00:31:50.306 Test: blockdev write read size > 128k ...passed 00:31:50.306 Test: blockdev write read invalid size ...passed 00:31:50.306 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:50.306 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:50.306 Test: blockdev write read max offset ...passed 00:31:50.306 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:50.306 Test: blockdev writev readv 8 blocks ...passed 00:31:50.306 Test: blockdev writev readv 30 x 1block ...passed 00:31:50.306 Test: blockdev writev readv block ...passed 00:31:50.306 Test: blockdev writev readv size > 128k ...passed 00:31:50.306 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:50.306 Test: blockdev comparev and writev ...passed 00:31:50.306 Test: blockdev nvme passthru rw ...passed 00:31:50.306 Test: blockdev nvme passthru vendor specific ...passed 00:31:50.306 Test: blockdev nvme admin passthru ...passed 00:31:50.306 Test: blockdev copy ...passed 00:31:50.306 00:31:50.306 Run Summary: Type Total Ran Passed Failed Inactive 00:31:50.306 suites 2 2 n/a 0 0 00:31:50.306 tests 46 46 46 0 0 00:31:50.306 asserts 260 260 260 0 n/a 00:31:50.306 00:31:50.306 Elapsed time = 0.148 seconds 00:31:50.306 0 00:31:50.306 16:08:10 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2719515 00:31:50.306 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2719515 ']' 00:31:50.306 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2719515 00:31:50.306 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:31:50.306 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:50.306 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2719515 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2719515' 00:31:50.567 killing process with pid 2719515 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2719515 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 
-- # wait 2719515 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:31:50.567 00:31:50.567 real 0m1.271s 00:31:50.567 user 0m3.439s 00:31:50.567 sys 0m0.287s 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:50.567 ************************************ 00:31:50.567 END TEST bdev_bounds 00:31:50.567 ************************************ 00:31:50.567 16:08:10 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:31:50.567 16:08:10 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:31:50.567 16:08:10 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:31:50.567 16:08:10 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:50.567 16:08:10 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:50.567 ************************************ 00:31:50.567 START TEST bdev_nbd 00:31:50.567 ************************************ 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2719834 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 
2719834 /var/tmp/spdk-nbd.sock 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2719834 ']' 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:31:50.567 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:50.567 16:08:10 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:50.827 [2024-07-12 16:08:11.022614] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:31:50.827 [2024-07-12 16:08:11.022661] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:50.827 [2024-07-12 16:08:11.110511] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:50.827 [2024-07-12 16:08:11.174344] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:51.087 [2024-07-12 16:08:11.312144] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:51.088 [2024-07-12 16:08:11.312184] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:51.088 [2024-07-12 16:08:11.312192] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:51.088 [2024-07-12 16:08:11.320162] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:51.088 [2024-07-12 16:08:11.320173] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:51.088 [2024-07-12 16:08:11.320178] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:51.088 [2024-07-12 16:08:11.328182] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:51.088 [2024-07-12 16:08:11.328192] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:51.088 [2024-07-12 16:08:11.328198] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:51.659 16:08:11 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:31:51.659 16:08:11 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:51.659 1+0 records in 00:31:51.659 1+0 records out 00:31:51.659 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243892 s, 16.8 MB/s 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:31:51.659 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:51.919 1+0 records in 00:31:51.919 1+0 records out 00:31:51.919 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271329 s, 15.1 MB/s 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:31:51.919 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:31:52.179 { 00:31:52.179 "nbd_device": "/dev/nbd0", 00:31:52.179 "bdev_name": "crypto_ram" 00:31:52.179 }, 00:31:52.179 { 00:31:52.179 "nbd_device": "/dev/nbd1", 00:31:52.179 "bdev_name": "crypto_ram3" 00:31:52.179 } 00:31:52.179 ]' 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:31:52.179 { 00:31:52.179 "nbd_device": "/dev/nbd0", 00:31:52.179 "bdev_name": "crypto_ram" 00:31:52.179 }, 00:31:52.179 { 00:31:52.179 "nbd_device": "/dev/nbd1", 00:31:52.179 "bdev_name": "crypto_ram3" 00:31:52.179 } 00:31:52.179 ]' 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 
/dev/nbd1' 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:52.179 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:52.439 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:52.698 16:08:12 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- 
# grep -c /dev/nbd 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:31:52.957 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:31:52.957 /dev/nbd0 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:53.216 1+0 records in 00:31:53.216 1+0 records out 00:31:53.216 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317575 s, 12.9 MB/s 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:31:53.216 /dev/nbd1 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:53.216 1+0 records in 00:31:53.216 1+0 records out 00:31:53.216 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000275103 s, 14.9 MB/s 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:53.216 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:31:53.475 { 00:31:53.475 "nbd_device": "/dev/nbd0", 00:31:53.475 "bdev_name": "crypto_ram" 00:31:53.475 }, 00:31:53.475 { 00:31:53.475 "nbd_device": "/dev/nbd1", 00:31:53.475 "bdev_name": "crypto_ram3" 00:31:53.475 } 00:31:53.475 ]' 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:31:53.475 { 00:31:53.475 "nbd_device": "/dev/nbd0", 00:31:53.475 "bdev_name": "crypto_ram" 00:31:53.475 }, 00:31:53.475 { 00:31:53.475 "nbd_device": "/dev/nbd1", 00:31:53.475 "bdev_name": "crypto_ram3" 00:31:53.475 } 00:31:53.475 ]' 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:31:53.475 /dev/nbd1' 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:31:53.475 /dev/nbd1' 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:31:53.475 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:31:53.475 256+0 records in 00:31:53.476 256+0 records out 00:31:53.476 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125473 s, 83.6 MB/s 00:31:53.476 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:53.476 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:31:53.735 256+0 records in 00:31:53.735 256+0 records out 00:31:53.735 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0158883 s, 66.0 MB/s 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:31:53.735 256+0 records in 00:31:53.735 256+0 records out 00:31:53.735 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0252985 s, 41.4 MB/s 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:53.735 16:08:13 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:53.735 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:53.735 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:53.735 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:53.735 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:53.735 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:53.735 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:53.735 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:53.735 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:53.735 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:53.735 16:08:14 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:53.994 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:53.994 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:53.994 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:53.994 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:53.994 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:53.994 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:53.994 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:53.995 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:53.995 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:53.995 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:53.995 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:31:54.254 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:31:54.513 malloc_lvol_verify 00:31:54.513 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:31:54.513 
d2d49b2a-7a97-4a7f-bf30-942193b57f57 00:31:54.513 16:08:14 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:31:54.772 2befab2b-3dad-4510-90bb-38bb8f805850 00:31:54.772 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:31:55.032 /dev/nbd0 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:31:55.032 mke2fs 1.46.5 (30-Dec-2021) 00:31:55.032 Discarding device blocks: 0/4096 done 00:31:55.032 Creating filesystem with 4096 1k blocks and 1024 inodes 00:31:55.032 00:31:55.032 Allocating group tables: 0/1 done 00:31:55.032 Writing inode tables: 0/1 done 00:31:55.032 Creating journal (1024 blocks): done 00:31:55.032 Writing superblocks and filesystem accounting information: 0/1 done 00:31:55.032 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2719834 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2719834 ']' 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2719834 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:55.032 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2719834 
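
For reference, the lvol-over-NBD check traced above reduces to a short rpc.py sequence. The sketch below replays it by hand; every command and argument (the malloc bdev, the "lvs" lvstore, the 4 MiB lvol, /dev/nbd0, mkfs.ext4) is copied from the trace, while the SPDK checkout path and the assumption that a target is already serving RPCs on /var/tmp/spdk-nbd.sock simply reflect this particular run.

#!/usr/bin/env bash
# Manual replay of the nbd_with_lvol_verify steps traced above (sketch only).
# Assumes an SPDK app is already listening on /var/tmp/spdk-nbd.sock.
set -e
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

$RPC bdev_malloc_create -b malloc_lvol_verify 16 512   # 16 MiB malloc bdev, 512-byte blocks
$RPC bdev_lvol_create_lvstore malloc_lvol_verify lvs   # prints the new lvstore UUID
$RPC bdev_lvol_create lvol 4 -l lvs                    # 4 MiB logical volume on "lvs"
$RPC nbd_start_disk lvs/lvol /dev/nbd0                 # expose the lvol as /dev/nbd0
mkfs.ext4 /dev/nbd0                                    # same filesystem check as in the log
$RPC nbd_stop_disk /dev/nbd0
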
00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2719834' 00:31:55.292 killing process with pid 2719834 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2719834 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 2719834 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:31:55.292 00:31:55.292 real 0m4.688s 00:31:55.292 user 0m7.038s 00:31:55.292 sys 0m1.365s 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:55.292 ************************************ 00:31:55.292 END TEST bdev_nbd 00:31:55.292 ************************************ 00:31:55.292 16:08:15 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:31:55.292 16:08:15 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:31:55.292 16:08:15 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:31:55.292 16:08:15 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:31:55.292 16:08:15 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:31:55.292 16:08:15 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:55.292 16:08:15 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:55.292 16:08:15 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:55.292 ************************************ 00:31:55.292 START TEST bdev_fio 00:31:55.292 ************************************ 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:55.292 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:31:55.292 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local 
env_context= 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:55.293 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:55.554 ************************************ 00:31:55.554 START TEST bdev_fio_rw_verify 00:31:55.554 ************************************ 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:55.554 16:08:15 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:55.814 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:55.814 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:55.814 fio-3.35 00:31:55.814 Starting 2 threads 00:32:08.074 00:32:08.074 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2721036: Fri Jul 12 16:08:26 2024 00:32:08.074 read: IOPS=30.8k, BW=120MiB/s (126MB/s)(1204MiB/10000msec) 00:32:08.074 slat (usec): min=8, max=468, avg=13.60, stdev= 4.04 00:32:08.074 clat (usec): min=4, max=632, avg=102.70, stdev=43.95 00:32:08.074 lat (usec): min=13, max=647, avg=116.30, stdev=45.56 00:32:08.074 clat percentiles (usec): 00:32:08.074 | 50.000th=[ 98], 99.000th=[ 206], 99.900th=[ 255], 99.990th=[ 424], 00:32:08.074 | 99.999th=[ 586] 00:32:08.074 write: IOPS=37.1k, BW=145MiB/s (152MB/s)(1374MiB/9472msec); 0 zone resets 00:32:08.074 slat (usec): min=8, max=368, avg=23.73, stdev= 4.86 00:32:08.074 clat (usec): min=16, max=2285, avg=139.21, stdev=68.03 00:32:08.074 lat (usec): min=33, max=2308, avg=162.95, stdev=70.17 00:32:08.074 clat percentiles (usec): 00:32:08.074 | 50.000th=[ 133], 99.000th=[ 289], 99.900th=[ 338], 99.990th=[ 668], 00:32:08.074 | 99.999th=[ 2212] 00:32:08.074 bw ( KiB/s): min=125912, max=164448, per=94.16%, avg=139857.68, stdev=7287.60, samples=38 00:32:08.074 iops : min=31478, max=41112, avg=34964.42, stdev=1821.90, samples=38 00:32:08.074 lat (usec) : 10=0.01%, 20=0.01%, 50=10.44%, 100=30.78%, 250=54.83% 00:32:08.074 lat (usec) : 500=3.92%, 750=0.01%, 1000=0.01% 00:32:08.074 lat (msec) : 4=0.01% 00:32:08.074 cpu : usr=99.70%, sys=0.01%, ctx=36, majf=0, minf=713 00:32:08.074 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:08.074 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:08.074 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:08.074 issued rwts: total=308280,351714,0,0 short=0,0,0,0 dropped=0,0,0,0 00:32:08.074 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:08.074 00:32:08.074 Run status group 0 (all jobs): 00:32:08.074 READ: bw=120MiB/s (126MB/s), 120MiB/s-120MiB/s (126MB/s-126MB/s), io=1204MiB (1263MB), run=10000-10000msec 00:32:08.074 WRITE: bw=145MiB/s (152MB/s), 145MiB/s-145MiB/s (152MB/s-152MB/s), io=1374MiB (1441MB), run=9472-9472msec 00:32:08.074 00:32:08.074 real 0m11.004s 00:32:08.074 user 0m29.368s 00:32:08.074 sys 0m0.289s 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:08.074 ************************************ 00:32:08.074 END TEST bdev_fio_rw_verify 00:32:08.074 ************************************ 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- 
bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "69faa954-aee2-5e71-8d22-22c271308f95"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "69faa954-aee2-5e71-8d22-22c271308f95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "91817087-4db1-5a5f-8bc6-d3a5d6c7d0b7"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "91817087-4db1-5a5f-8bc6-d3a5d6c7d0b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": 
false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:32:08.074 crypto_ram3 ]] 00:32:08.074 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "69faa954-aee2-5e71-8d22-22c271308f95"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "69faa954-aee2-5e71-8d22-22c271308f95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "91817087-4db1-5a5f-8bc6-d3a5d6c7d0b7"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "91817087-4db1-5a5f-8bc6-d3a5d6c7d0b7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' 
},' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:08.075 16:08:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:08.075 ************************************ 00:32:08.075 START TEST bdev_fio_trim 00:32:08.075 ************************************ 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 
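
Both fio passes in this suite (the randwrite/verify pass whose results appear above and the trimwrite pass traced below) are launched the same way: fio_config_gen writes a bdev.fio with one job section per crypto bdev, and fio_bdev runs fio with the spdk_bdev ioengine preloaded. The sketch below is a hedged reconstruction of the verify invocation: the command-line flags, the LD_PRELOAD plugin path and the two [job_*] sections are copied from the trace, while the [global] section is an assumption, since the template fio_config_gen copies is not part of this excerpt.

# Hedged sketch of the bdev_fio_rw_verify invocation traced above.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
cat > bdev.fio <<'EOF'
[global]
thread=1
direct=1
rw=randwrite      # matches the "rw=randwrite" fio reports for both jobs above
verify=crc32c     # assumption: the "verify" workload enables one of fio's verify modes

[job_crypto_ram]
filename=crypto_ram

[job_crypto_ram3]
filename=crypto_ram3
EOF

LD_PRELOAD="$SPDK/build/fio/spdk_bdev" /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 bdev.fio \
    --verify_state_save=0 \
    --spdk_json_conf="$SPDK/test/bdev/bdev.json" \
    --spdk_mem=0 --aux-path="$SPDK/../output"

The trim pass differs only in the generated job file (fio reports rw=trimwrite for it below); the wrapper and flags are identical.
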
00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:08.075 16:08:27 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:08.075 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:08.075 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:08.075 fio-3.35 00:32:08.075 Starting 2 threads 00:32:18.066 00:32:18.066 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2723027: Fri Jul 12 16:08:37 2024 00:32:18.066 write: IOPS=48.0k, BW=188MiB/s (197MB/s)(1876MiB/10001msec); 0 zone resets 00:32:18.066 slat (usec): min=10, max=2118, avg=17.46, stdev= 6.53 00:32:18.066 clat (usec): min=11, max=2537, avg=138.45, stdev=81.70 00:32:18.066 lat (usec): min=38, max=2568, avg=155.91, stdev=85.26 00:32:18.066 clat percentiles (usec): 00:32:18.066 | 50.000th=[ 110], 99.000th=[ 347], 99.900th=[ 445], 99.990th=[ 701], 00:32:18.066 | 99.999th=[ 832] 00:32:18.066 bw ( KiB/s): min=136072, max=209672, per=99.66%, avg=191453.05, stdev=13418.14, samples=38 00:32:18.066 iops : min=34018, max=52418, avg=47863.26, stdev=3354.54, samples=38 00:32:18.066 trim: IOPS=48.0k, BW=188MiB/s (197MB/s)(1876MiB/10001msec); 0 
zone resets 00:32:18.066 slat (usec): min=4, max=412, avg= 8.40, stdev= 3.37 00:32:18.067 clat (usec): min=37, max=567, avg=92.29, stdev=33.20 00:32:18.067 lat (usec): min=44, max=582, avg=100.69, stdev=34.17 00:32:18.067 clat percentiles (usec): 00:32:18.067 | 50.000th=[ 89], 99.000th=[ 200], 99.900th=[ 251], 99.990th=[ 338], 00:32:18.067 | 99.999th=[ 529] 00:32:18.067 bw ( KiB/s): min=136072, max=209680, per=99.66%, avg=191454.32, stdev=13418.43, samples=38 00:32:18.067 iops : min=34018, max=52420, avg=47863.58, stdev=3354.61, samples=38 00:32:18.067 lat (usec) : 20=0.01%, 50=9.44%, 100=44.70%, 250=40.94%, 500=4.89% 00:32:18.067 lat (usec) : 750=0.02%, 1000=0.01% 00:32:18.067 lat (msec) : 4=0.01% 00:32:18.067 cpu : usr=99.68%, sys=0.01%, ctx=40, majf=0, minf=318 00:32:18.067 IO depths : 1=7.5%, 2=17.5%, 4=60.0%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:18.067 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:18.067 complete : 0=0.0%, 4=87.0%, 8=13.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:18.067 issued rwts: total=0,480336,480336,0 short=0,0,0,0 dropped=0,0,0,0 00:32:18.067 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:18.067 00:32:18.067 Run status group 0 (all jobs): 00:32:18.067 WRITE: bw=188MiB/s (197MB/s), 188MiB/s-188MiB/s (197MB/s-197MB/s), io=1876MiB (1967MB), run=10001-10001msec 00:32:18.067 TRIM: bw=188MiB/s (197MB/s), 188MiB/s-188MiB/s (197MB/s-197MB/s), io=1876MiB (1967MB), run=10001-10001msec 00:32:18.067 00:32:18.067 real 0m11.057s 00:32:18.067 user 0m27.481s 00:32:18.067 sys 0m0.345s 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:18.067 ************************************ 00:32:18.067 END TEST bdev_fio_trim 00:32:18.067 ************************************ 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:32:18.067 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:32:18.067 00:32:18.067 real 0m22.397s 00:32:18.067 user 0m57.028s 00:32:18.067 sys 0m0.808s 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:18.067 ************************************ 00:32:18.067 END TEST bdev_fio 00:32:18.067 ************************************ 00:32:18.067 16:08:38 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:18.067 16:08:38 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:18.067 16:08:38 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:18.067 16:08:38 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:32:18.067 16:08:38 
blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:18.067 16:08:38 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:18.067 ************************************ 00:32:18.067 START TEST bdev_verify 00:32:18.067 ************************************ 00:32:18.067 16:08:38 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:18.067 [2024-07-12 16:08:38.247526] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:32:18.067 [2024-07-12 16:08:38.247572] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2724585 ] 00:32:18.067 [2024-07-12 16:08:38.336665] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:18.067 [2024-07-12 16:08:38.415626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:18.067 [2024-07-12 16:08:38.415630] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:18.327 [2024-07-12 16:08:38.556060] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:18.327 [2024-07-12 16:08:38.556107] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:18.327 [2024-07-12 16:08:38.556116] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:18.327 [2024-07-12 16:08:38.564076] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:18.327 [2024-07-12 16:08:38.564087] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:18.327 [2024-07-12 16:08:38.564099] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:18.327 [2024-07-12 16:08:38.572098] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:18.327 [2024-07-12 16:08:38.572108] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:18.327 [2024-07-12 16:08:38.572113] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:18.327 Running I/O for 5 seconds... 
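For reference while reading the results that follow: the verify pass above boils down to the single bdevperf invocation already traced in this log. Below is a minimal standalone sketch of that invocation in bash; the $SPDK variable and the flag readings in the comments are assumptions added here, while the binary path, JSON config and flag values are copied from the trace. The trailing empty '' argument seen in the trace (what appears to be the suite's env_ctx, empty for crypto_sw) is dropped in this sketch.

# Sketch only: re-running the traced verify step by hand, outside the autotest harness.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk   # workspace path as it appears in the trace
args=(
    --json "$SPDK/test/bdev/bdev.json"   # bdev layout: Malloc bases plus the crypto_ram* vbdevs
    -q 128                               # queue depth per job
    -o 4096                              # I/O size in bytes (4 KiB)
    -w verify                            # write-then-read-back verification workload
    -t 5                                 # run time in seconds ("Running I/O for 5 seconds...")
    -C                                   # copied verbatim from the trace; its exact meaning is not asserted here
    -m 0x3                               # core mask 0x3: the two reactors started on cores 0 and 1 above
)
"$SPDK/build/examples/bdevperf" "${args[@]}"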
00:32:23.608
00:32:23.608 Latency(us)
00:32:23.608 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:23.608 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:23.608 Verification LBA range: start 0x0 length 0x800
00:32:23.609 crypto_ram : 5.01 6975.77 27.25 0.00 0.00 18271.43 1209.90 22887.19
00:32:23.609 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:23.609 Verification LBA range: start 0x800 length 0x800
00:32:23.609 crypto_ram : 5.00 5933.48 23.18 0.00 0.00 21468.42 1461.96 27424.30
00:32:23.609 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:32:23.609 Verification LBA range: start 0x0 length 0x800
00:32:23.609 crypto_ram3 : 5.02 3515.52 13.73 0.00 0.00 36204.50 1398.94 28634.19
00:32:23.609 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:32:23.609 Verification LBA range: start 0x800 length 0x800
00:32:23.609 crypto_ram3 : 5.02 2984.22 11.66 0.00 0.00 42644.30 1865.26 30852.33
00:32:23.609 ===================================================================================================================
00:32:23.609 Total : 19409.00 75.82 0.00 0.00 26255.76 1209.90 30852.33
00:32:23.609
00:32:23.609 real 0m5.592s
00:32:23.609 user 0m10.693s
00:32:23.609 sys 0m0.160s
00:32:23.609 16:08:43 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:23.609 16:08:43 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:32:23.609 ************************************
00:32:23.609 END TEST bdev_verify
00:32:23.609 ************************************
00:32:23.609 16:08:43 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:32:23.609 16:08:43 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:23.609 16:08:43 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:32:23.609 16:08:43 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:23.609 16:08:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:32:23.609 ************************************
00:32:23.609 START TEST bdev_verify_big_io
00:32:23.609 ************************************
00:32:23.609 16:08:43 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:32:23.609 [2024-07-12 16:08:43.919904] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization...
00:32:23.609 [2024-07-12 16:08:43.919949] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2725522 ] 00:32:23.609 [2024-07-12 16:08:44.007247] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:23.869 [2024-07-12 16:08:44.082391] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:23.869 [2024-07-12 16:08:44.082396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:23.869 [2024-07-12 16:08:44.220195] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:23.869 [2024-07-12 16:08:44.220248] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:23.869 [2024-07-12 16:08:44.220256] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:23.869 [2024-07-12 16:08:44.228212] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:23.869 [2024-07-12 16:08:44.228224] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:23.869 [2024-07-12 16:08:44.228230] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:23.869 [2024-07-12 16:08:44.236236] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:23.869 [2024-07-12 16:08:44.236246] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:23.869 [2024-07-12 16:08:44.236251] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:23.869 Running I/O for 5 seconds... 
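The big-I/O pass now running is the same bdevperf verify run with only the -o argument changed from 4096 to 65536 bytes, which is why the IOPS and per-I/O latencies in the table that follows sit in a very different range from the 4 KiB pass. A hedged sketch of a wrapper that would reproduce both passes back to back (the loop and variable names are invented here; the flags are the ones traced above):

# Sketch only: run the 4 KiB and 64 KiB verify passes in sequence, mirroring
# the bdev_verify and bdev_verify_big_io steps in this log.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
for io_size in 4096 65536; do
    "$SPDK/build/examples/bdevperf" --json "$SPDK/test/bdev/bdev.json" \
        -q 128 -o "$io_size" -w verify -t 5 -C -m 0x3
done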
00:32:30.446
00:32:30.446 Latency(us)
00:32:30.446 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:30.446 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:30.446 Verification LBA range: start 0x0 length 0x80
00:32:30.446 crypto_ram : 5.30 459.08 28.69 0.00 0.00 272014.83 4032.98 387166.52
00:32:30.446 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:30.446 Verification LBA range: start 0x80 length 0x80
00:32:30.446 crypto_ram : 5.19 394.64 24.67 0.00 0.00 315565.91 3780.92 432335.95
00:32:30.446 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:32:30.446 Verification LBA range: start 0x0 length 0x80
00:32:30.446 crypto_ram3 : 5.31 241.24 15.08 0.00 0.00 499760.42 3150.77 398458.88
00:32:30.446 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:32:30.446 Verification LBA range: start 0x80 length 0x80
00:32:30.446 crypto_ram3 : 5.33 216.29 13.52 0.00 0.00 552666.42 3856.54 445241.50
00:32:30.446 ===================================================================================================================
00:32:30.446 Total : 1311.26 81.95 0.00 0.00 373869.23 3150.77 445241.50
00:32:30.446
00:32:30.446 real 0m5.891s
00:32:30.446 user 0m11.291s
00:32:30.446 sys 0m0.165s
00:32:30.446 16:08:49 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:30.446 16:08:49 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:32:30.446 ************************************
00:32:30.446 END TEST bdev_verify_big_io
00:32:30.446 ************************************
00:32:30.446 16:08:49 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:32:30.446 16:08:49 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:30.446 16:08:49 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:32:30.446 16:08:49 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:30.446 16:08:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:32:30.446 ************************************
00:32:30.446 START TEST bdev_write_zeroes
00:32:30.446 ************************************
00:32:30.446 16:08:49 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:30.446 [2024-07-12 16:08:49.893732] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization...
00:32:30.446 [2024-07-12 16:08:49.893786] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2726495 ]
00:32:30.446 [2024-07-12 16:08:49.983965] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1
00:32:30.446 [2024-07-12 16:08:50.064635] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:30.446 [2024-07-12 16:08:50.205749] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:32:30.446 [2024-07-12 16:08:50.205796] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:30.446 [2024-07-12 16:08:50.205804] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:30.446 [2024-07-12 16:08:50.213764] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:32:30.446 [2024-07-12 16:08:50.213776] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:30.446 [2024-07-12 16:08:50.213781] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:30.446 [2024-07-12 16:08:50.221786] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:32:30.446 [2024-07-12 16:08:50.221796] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:32:30.446 [2024-07-12 16:08:50.221802] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:30.446 Running I/O for 1 seconds...
00:32:31.016
00:32:31.016 Latency(us)
00:32:31.016 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:31.016 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:31.016 crypto_ram : 1.01 33082.17 129.23 0.00 0.00 3860.75 1001.94 5394.12
00:32:31.016 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:32:31.016 crypto_ram3 : 1.01 16512.54 64.50 0.00 0.00 7707.04 4864.79 7965.14
00:32:31.016 ===================================================================================================================
00:32:31.016 Total : 49594.71 193.73 0.00 0.00 5142.85 1001.94 7965.14
00:32:31.016
00:32:31.016 real 0m1.558s
00:32:31.016 user 0m1.392s
00:32:31.016 sys 0m0.146s
00:32:31.016 16:08:51 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:32:31.016 16:08:51 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:32:31.016 ************************************
00:32:31.016 END TEST bdev_write_zeroes
00:32:31.016 ************************************
00:32:31.016 16:08:51 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0
00:32:31.016 16:08:51 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:32:31.016 16:08:51 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:32:31.016 16:08:51 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable
00:32:31.016 16:08:51 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:32:31.275
************************************ 00:32:31.275 START TEST bdev_json_nonenclosed 00:32:31.275 ************************************ 00:32:31.275 16:08:51 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:31.275 [2024-07-12 16:08:51.530539] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:32:31.275 [2024-07-12 16:08:51.530588] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2726782 ] 00:32:31.275 [2024-07-12 16:08:51.619232] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:31.275 [2024-07-12 16:08:51.695679] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:31.276 [2024-07-12 16:08:51.695741] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:32:31.276 [2024-07-12 16:08:51.695755] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:31.276 [2024-07-12 16:08:51.695762] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:31.535 00:32:31.535 real 0m0.278s 00:32:31.535 user 0m0.172s 00:32:31.535 sys 0m0.104s 00:32:31.535 16:08:51 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:32:31.535 16:08:51 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:31.535 16:08:51 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:32:31.535 ************************************ 00:32:31.535 END TEST bdev_json_nonenclosed 00:32:31.535 ************************************ 00:32:31.535 16:08:51 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:32:31.535 16:08:51 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:32:31.535 16:08:51 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:31.535 16:08:51 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:32:31.535 16:08:51 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:31.535 16:08:51 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:31.535 ************************************ 00:32:31.535 START TEST bdev_json_nonarray 00:32:31.535 ************************************ 00:32:31.535 16:08:51 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:31.535 [2024-07-12 16:08:51.890004] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:32:31.535 [2024-07-12 16:08:51.890057] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2726884 ] 00:32:31.535 [2024-07-12 16:08:51.978439] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:31.795 [2024-07-12 16:08:52.055354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:31.795 [2024-07-12 16:08:52.055412] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:32:31.795 [2024-07-12 16:08:52.055423] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:31.795 [2024-07-12 16:08:52.055431] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:31.795 00:32:31.795 real 0m0.279s 00:32:31.795 user 0m0.170s 00:32:31.795 sys 0m0.107s 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:32:31.795 ************************************ 00:32:31.795 END TEST bdev_json_nonarray 00:32:31.795 ************************************ 00:32:31.795 16:08:52 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:32:31.795 16:08:52 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:32:31.795 16:08:52 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:32:31.795 16:08:52 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:32:31.795 16:08:52 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:32:31.795 16:08:52 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:32:31.795 16:08:52 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:32:31.795 16:08:52 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:31.795 16:08:52 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:31.795 ************************************ 00:32:31.795 START TEST bdev_crypto_enomem 00:32:31.795 ************************************ 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=2727068 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 2727068 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w 
randwrite -t 5 -f '' 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 2727068 ']' 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:31.795 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:31.795 16:08:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:32.056 [2024-07-12 16:08:52.260690] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:32:32.056 [2024-07-12 16:08:52.260749] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2727068 ] 00:32:32.056 [2024-07-12 16:08:52.340996] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:32.056 [2024-07-12 16:08:52.440570] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:32.995 true 00:32:32.995 base0 00:32:32.995 true 00:32:32.995 [2024-07-12 16:08:53.138601] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:32.995 crypt0 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:32.995 16:08:53 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000
00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:32:32.995 [
00:32:32.995 {
00:32:32.995 "name": "crypt0",
00:32:32.995 "aliases": [
00:32:32.995 "85fa4658-6f36-57e6-b2a4-9388a360abab"
00:32:32.995 ],
00:32:32.995 "product_name": "crypto",
00:32:32.995 "block_size": 512,
00:32:32.995 "num_blocks": 2097152,
00:32:32.995 "uuid": "85fa4658-6f36-57e6-b2a4-9388a360abab",
00:32:32.995 "assigned_rate_limits": {
00:32:32.995 "rw_ios_per_sec": 0,
00:32:32.995 "rw_mbytes_per_sec": 0,
00:32:32.995 "r_mbytes_per_sec": 0,
00:32:32.995 "w_mbytes_per_sec": 0
00:32:32.995 },
00:32:32.995 "claimed": false,
00:32:32.995 "zoned": false,
00:32:32.995 "supported_io_types": {
00:32:32.995 "read": true,
00:32:32.995 "write": true,
00:32:32.995 "unmap": false,
00:32:32.995 "flush": false,
00:32:32.995 "reset": true,
00:32:32.995 "nvme_admin": false,
00:32:32.995 "nvme_io": false,
00:32:32.995 "nvme_io_md": false,
00:32:32.995 "write_zeroes": true,
00:32:32.995 "zcopy": false,
00:32:32.995 "get_zone_info": false,
00:32:32.995 "zone_management": false,
00:32:32.995 "zone_append": false,
00:32:32.995 "compare": false,
00:32:32.995 "compare_and_write": false,
00:32:32.995 "abort": false,
00:32:32.995 "seek_hole": false,
00:32:32.995 "seek_data": false,
00:32:32.995 "copy": false,
00:32:32.995 "nvme_iov_md": false
00:32:32.995 },
00:32:32.995 "memory_domains": [
00:32:32.995 {
00:32:32.995 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:32:32.995 "dma_device_type": 2
00:32:32.995 }
00:32:32.995 ],
00:32:32.995 "driver_specific": {
00:32:32.995 "crypto": {
00:32:32.995 "base_bdev_name": "EE_base0",
00:32:32.995 "name": "crypt0",
00:32:32.995 "key_name": "test_dek_sw"
00:32:32.995 }
00:32:32.995 }
00:32:32.995 }
00:32:32.995 ]
00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0
00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=2727116
00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1
00:32:32.995 16:08:53 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests
00:32:32.995 Running I/O for 5 seconds...
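The ENOMEM case traced above and below follows a small RPC sequence: bdevperf is started with -z so it waits for an RPC trigger, crypt0 is stacked on the error-injection bdev EE_base0, the queued randwrite run is kicked off, write failures are injected, and the crypto bdev is deleted afterwards. A condensed sketch of that sequence follows; rpc_cmd in the trace is assumed to be the harness's wrapper around scripts/rpc.py, and the $rpc_py / $run_pid names are introduced here, while every RPC name and argument is verbatim from the trace.

# Sketch only: the enomem flow, restated as plain shell against an already-running
# "bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5" process.
SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
rpc_py="$SPDK/scripts/rpc.py"

"$rpc_py" bdev_wait_for_examine                              # let crypt0 finish registering on EE_base0
"$rpc_py" bdev_get_bdevs -b crypt0 -t 2000                   # confirm the crypto vbdev exists (JSON shown above)
"$SPDK/examples/bdev/bdevperf/bdevperf.py" perform_tests &   # start the queued randwrite run
run_pid=$!
sleep 1                                                      # give the run a moment before injecting failures
"$rpc_py" bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem   # arguments copied verbatim from the trace
wait "$run_pid"                                              # the latency table below is printed at this point
"$rpc_py" bdev_crypto_delete crypt0                          # tear the crypto vbdev back down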
00:32:33.936 16:08:54 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem
00:32:33.936 16:08:54 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:32:33.936 16:08:54 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:32:33.936 16:08:54 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:32:33.936 16:08:54 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 2727116
00:32:38.140
00:32:38.140 Latency(us)
00:32:38.140 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:38.140 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096)
00:32:38.140 crypt0 : 5.00 36195.43 141.39 0.00 0.00 880.24 441.11 1184.69
00:32:38.140 ===================================================================================================================
00:32:38.140 Total : 36195.43 141.39 0.00 0.00 880.24 441.11 1184.69
00:32:38.140 0
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 2727068
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 2727068 ']'
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 2727068
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2727068
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']'
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2727068'
00:32:38.140 killing process with pid 2727068
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 2727068
00:32:38.140 Received shutdown signal, test time was about 5.000000 seconds
00:32:38.140
00:32:38.140 Latency(us)
00:32:38.140 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:32:38.140 ===================================================================================================================
00:32:38.140 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 2727068
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT
00:32:38.140
00:32:38.140 real 0m6.331s
00:32:38.140 user 0m6.570s
00:32:38.140 sys 0m0.304s
00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem --
common/autotest_common.sh@1124 -- # xtrace_disable 00:32:38.140 16:08:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:38.140 ************************************ 00:32:38.140 END TEST bdev_crypto_enomem 00:32:38.140 ************************************ 00:32:38.140 16:08:58 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:32:38.140 16:08:58 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:32:38.140 16:08:58 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:32:38.140 16:08:58 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:32:38.140 16:08:58 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:38.140 16:08:58 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:32:38.140 16:08:58 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:32:38.140 16:08:58 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:32:38.140 16:08:58 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:32:38.140 00:32:38.140 real 0m51.275s 00:32:38.140 user 1m40.211s 00:32:38.140 sys 0m4.616s 00:32:38.140 16:08:58 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:38.140 16:08:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:38.140 ************************************ 00:32:38.140 END TEST blockdev_crypto_sw 00:32:38.140 ************************************ 00:32:38.404 16:08:58 -- common/autotest_common.sh@1142 -- # return 0 00:32:38.404 16:08:58 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:32:38.404 16:08:58 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:38.404 16:08:58 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:38.405 16:08:58 -- common/autotest_common.sh@10 -- # set +x 00:32:38.405 ************************************ 00:32:38.405 START TEST blockdev_crypto_qat 00:32:38.405 ************************************ 00:32:38.405 16:08:58 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:32:38.405 * Looking for test storage... 
00:32:38.405 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2728086 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 2728086 00:32:38.405 16:08:58 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:38.405 16:08:58 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 2728086 ']' 00:32:38.405 16:08:58 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:38.405 16:08:58 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:38.405 16:08:58 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:32:38.405 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:38.405 16:08:58 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:38.405 16:08:58 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:38.405 [2024-07-12 16:08:58.838141] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:32:38.405 [2024-07-12 16:08:58.838201] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2728086 ] 00:32:38.708 [2024-07-12 16:08:58.929029] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:38.708 [2024-07-12 16:08:59.023207] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:39.279 16:08:59 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:39.279 16:08:59 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:32:39.279 16:08:59 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:32:39.279 16:08:59 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:32:39.279 16:08:59 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:32:39.279 16:08:59 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:39.279 16:08:59 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:39.279 [2024-07-12 16:08:59.701256] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:39.279 [2024-07-12 16:08:59.709279] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:39.279 [2024-07-12 16:08:59.717296] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:39.539 [2024-07-12 16:08:59.785225] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:42.082 true 00:32:42.082 true 00:32:42.082 true 00:32:42.082 true 00:32:42.082 Malloc0 00:32:42.082 Malloc1 00:32:42.082 Malloc2 00:32:42.082 Malloc3 00:32:42.082 [2024-07-12 16:09:02.209356] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:42.082 crypto_ram 00:32:42.082 [2024-07-12 16:09:02.217376] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:42.082 crypto_ram1 00:32:42.082 [2024-07-12 16:09:02.225399] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:42.082 crypto_ram2 00:32:42.082 [2024-07-12 16:09:02.233421] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:42.082 crypto_ram3 00:32:42.082 [ 00:32:42.082 { 00:32:42.082 "name": "Malloc1", 00:32:42.082 "aliases": [ 00:32:42.082 "a0588427-c55f-43ac-9c2e-ccf878dcb821" 00:32:42.082 ], 00:32:42.082 "product_name": "Malloc disk", 00:32:42.082 "block_size": 512, 00:32:42.082 "num_blocks": 65536, 00:32:42.082 "uuid": "a0588427-c55f-43ac-9c2e-ccf878dcb821", 00:32:42.082 "assigned_rate_limits": { 00:32:42.082 "rw_ios_per_sec": 0, 00:32:42.082 "rw_mbytes_per_sec": 0, 00:32:42.082 "r_mbytes_per_sec": 0, 00:32:42.082 "w_mbytes_per_sec": 0 00:32:42.082 }, 00:32:42.082 "claimed": true, 00:32:42.082 "claim_type": "exclusive_write", 00:32:42.082 "zoned": false, 00:32:42.082 "supported_io_types": { 
00:32:42.082 "read": true, 00:32:42.082 "write": true, 00:32:42.082 "unmap": true, 00:32:42.082 "flush": true, 00:32:42.082 "reset": true, 00:32:42.082 "nvme_admin": false, 00:32:42.082 "nvme_io": false, 00:32:42.082 "nvme_io_md": false, 00:32:42.082 "write_zeroes": true, 00:32:42.082 "zcopy": true, 00:32:42.082 "get_zone_info": false, 00:32:42.082 "zone_management": false, 00:32:42.082 "zone_append": false, 00:32:42.082 "compare": false, 00:32:42.082 "compare_and_write": false, 00:32:42.082 "abort": true, 00:32:42.082 "seek_hole": false, 00:32:42.082 "seek_data": false, 00:32:42.082 "copy": true, 00:32:42.082 "nvme_iov_md": false 00:32:42.082 }, 00:32:42.082 "memory_domains": [ 00:32:42.082 { 00:32:42.082 "dma_device_id": "system", 00:32:42.082 "dma_device_type": 1 00:32:42.082 }, 00:32:42.082 { 00:32:42.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:42.082 "dma_device_type": 2 00:32:42.082 } 00:32:42.082 ], 00:32:42.082 "driver_specific": {} 00:32:42.082 } 00:32:42.082 ] 00:32:42.082 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:42.082 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:32:42.082 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:42.082 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:42.082 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:42.082 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:32:42.082 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:32:42.082 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:42.082 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:42.082 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:42.082 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:32:42.082 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:42.082 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "15907dd0-9583-53cd-97f0-26e9320557e0"' ' ],' ' "product_name": "crypto",' 
' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "15907dd0-9583-53cd-97f0-26e9320557e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "12e3cbea-fddb-5979-960a-6665bf18e658"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "12e3cbea-fddb-5979-960a-6665bf18e658",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3b8603a6-9440-5b94-b136-3f19cae2ef4e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3b8603a6-9440-5b94-b136-3f19cae2ef4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' 
"driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "dc8b5ea0-20b8-5e10-8516-be9207feffbd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "dc8b5ea0-20b8-5e10-8516-be9207feffbd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:32:42.083 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 2728086 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 2728086 ']' 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 2728086 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2728086 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2728086' 00:32:42.083 killing process with pid 2728086 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 2728086 00:32:42.083 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 2728086 00:32:42.651 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:42.651 16:09:02 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:42.651 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:32:42.651 16:09:02 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:42.651 16:09:02 blockdev_crypto_qat -- 
common/autotest_common.sh@10 -- # set +x 00:32:42.651 ************************************ 00:32:42.651 START TEST bdev_hello_world 00:32:42.651 ************************************ 00:32:42.651 16:09:02 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:42.651 [2024-07-12 16:09:02.986197] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:32:42.651 [2024-07-12 16:09:02.986252] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2728834 ] 00:32:42.651 [2024-07-12 16:09:03.076386] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:42.909 [2024-07-12 16:09:03.170356] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:42.909 [2024-07-12 16:09:03.191461] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:42.909 [2024-07-12 16:09:03.199486] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:42.909 [2024-07-12 16:09:03.207504] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:42.909 [2024-07-12 16:09:03.310077] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:45.449 [2024-07-12 16:09:05.560859] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:45.449 [2024-07-12 16:09:05.560910] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:45.449 [2024-07-12 16:09:05.560919] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:45.449 [2024-07-12 16:09:05.568877] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:45.449 [2024-07-12 16:09:05.568887] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:45.449 [2024-07-12 16:09:05.568893] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:45.449 [2024-07-12 16:09:05.576896] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:45.449 [2024-07-12 16:09:05.576906] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:45.449 [2024-07-12 16:09:05.576911] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:45.449 [2024-07-12 16:09:05.584916] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:45.449 [2024-07-12 16:09:05.584925] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:45.449 [2024-07-12 16:09:05.584931] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:45.449 [2024-07-12 16:09:05.646390] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:45.449 [2024-07-12 16:09:05.646417] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:45.449 [2024-07-12 16:09:05.646427] hello_bdev.c: 244:hello_start: 
*NOTICE*: Opening io channel 00:32:45.449 [2024-07-12 16:09:05.647453] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:45.449 [2024-07-12 16:09:05.647504] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:45.449 [2024-07-12 16:09:05.647514] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:45.449 [2024-07-12 16:09:05.647546] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:32:45.449 00:32:45.449 [2024-07-12 16:09:05.647556] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:45.449 00:32:45.449 real 0m2.952s 00:32:45.449 user 0m2.568s 00:32:45.449 sys 0m0.335s 00:32:45.449 16:09:05 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:45.449 16:09:05 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:45.449 ************************************ 00:32:45.449 END TEST bdev_hello_world 00:32:45.449 ************************************ 00:32:45.709 16:09:05 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:32:45.709 16:09:05 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:45.709 16:09:05 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:32:45.709 16:09:05 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:45.709 16:09:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:45.709 ************************************ 00:32:45.709 START TEST bdev_bounds 00:32:45.709 ************************************ 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2729453 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2729453' 00:32:45.709 Process bdevio pid: 2729453 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2729453 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 2729453 ']' 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:45.709 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:45.709 16:09:05 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:45.709 [2024-07-12 16:09:06.016884] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:32:45.709 [2024-07-12 16:09:06.016931] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2729453 ] 00:32:45.709 [2024-07-12 16:09:06.105891] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:45.968 [2024-07-12 16:09:06.171199] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:45.968 [2024-07-12 16:09:06.171345] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:45.968 [2024-07-12 16:09:06.171347] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:45.968 [2024-07-12 16:09:06.192418] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:45.968 [2024-07-12 16:09:06.200445] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:45.968 [2024-07-12 16:09:06.208464] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:45.968 [2024-07-12 16:09:06.291951] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:48.510 [2024-07-12 16:09:08.443442] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:48.510 [2024-07-12 16:09:08.443501] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:48.510 [2024-07-12 16:09:08.443510] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:48.510 [2024-07-12 16:09:08.451462] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:48.510 [2024-07-12 16:09:08.451473] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:48.510 [2024-07-12 16:09:08.451479] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:48.510 [2024-07-12 16:09:08.459482] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:48.510 [2024-07-12 16:09:08.459492] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:48.510 [2024-07-12 16:09:08.459498] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:48.510 [2024-07-12 16:09:08.467501] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:48.510 [2024-07-12 16:09:08.467512] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:48.510 [2024-07-12 16:09:08.467518] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:48.510 16:09:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:48.510 16:09:08 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:32:48.510 16:09:08 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:48.510 I/O targets: 00:32:48.510 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:48.510 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:32:48.510 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:32:48.510 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:48.510 
00:32:48.510 00:32:48.510 CUnit - A unit testing framework for C - Version 2.1-3 00:32:48.510 http://cunit.sourceforge.net/ 00:32:48.510 00:32:48.510 00:32:48.510 Suite: bdevio tests on: crypto_ram3 00:32:48.510 Test: blockdev write read block ...passed 00:32:48.510 Test: blockdev write zeroes read block ...passed 00:32:48.510 Test: blockdev write zeroes read no split ...passed 00:32:48.510 Test: blockdev write zeroes read split ...passed 00:32:48.510 Test: blockdev write zeroes read split partial ...passed 00:32:48.510 Test: blockdev reset ...passed 00:32:48.510 Test: blockdev write read 8 blocks ...passed 00:32:48.510 Test: blockdev write read size > 128k ...passed 00:32:48.510 Test: blockdev write read invalid size ...passed 00:32:48.510 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:48.510 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:48.510 Test: blockdev write read max offset ...passed 00:32:48.510 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:48.510 Test: blockdev writev readv 8 blocks ...passed 00:32:48.510 Test: blockdev writev readv 30 x 1block ...passed 00:32:48.510 Test: blockdev writev readv block ...passed 00:32:48.510 Test: blockdev writev readv size > 128k ...passed 00:32:48.510 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:48.510 Test: blockdev comparev and writev ...passed 00:32:48.510 Test: blockdev nvme passthru rw ...passed 00:32:48.510 Test: blockdev nvme passthru vendor specific ...passed 00:32:48.510 Test: blockdev nvme admin passthru ...passed 00:32:48.510 Test: blockdev copy ...passed 00:32:48.510 Suite: bdevio tests on: crypto_ram2 00:32:48.510 Test: blockdev write read block ...passed 00:32:48.510 Test: blockdev write zeroes read block ...passed 00:32:48.510 Test: blockdev write zeroes read no split ...passed 00:32:48.510 Test: blockdev write zeroes read split ...passed 00:32:48.511 Test: blockdev write zeroes read split partial ...passed 00:32:48.511 Test: blockdev reset ...passed 00:32:48.511 Test: blockdev write read 8 blocks ...passed 00:32:48.511 Test: blockdev write read size > 128k ...passed 00:32:48.511 Test: blockdev write read invalid size ...passed 00:32:48.511 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:48.511 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:48.511 Test: blockdev write read max offset ...passed 00:32:48.511 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:48.511 Test: blockdev writev readv 8 blocks ...passed 00:32:48.511 Test: blockdev writev readv 30 x 1block ...passed 00:32:48.511 Test: blockdev writev readv block ...passed 00:32:48.511 Test: blockdev writev readv size > 128k ...passed 00:32:48.511 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:48.511 Test: blockdev comparev and writev ...passed 00:32:48.511 Test: blockdev nvme passthru rw ...passed 00:32:48.511 Test: blockdev nvme passthru vendor specific ...passed 00:32:48.511 Test: blockdev nvme admin passthru ...passed 00:32:48.511 Test: blockdev copy ...passed 00:32:48.511 Suite: bdevio tests on: crypto_ram1 00:32:48.511 Test: blockdev write read block ...passed 00:32:48.511 Test: blockdev write zeroes read block ...passed 00:32:48.511 Test: blockdev write zeroes read no split ...passed 00:32:48.772 Test: blockdev write zeroes read split ...passed 00:32:48.772 Test: blockdev write zeroes read split partial ...passed 00:32:48.772 Test: blockdev reset 
...passed 00:32:48.772 Test: blockdev write read 8 blocks ...passed 00:32:48.772 Test: blockdev write read size > 128k ...passed 00:32:48.772 Test: blockdev write read invalid size ...passed 00:32:48.772 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:48.772 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:48.772 Test: blockdev write read max offset ...passed 00:32:48.772 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:48.772 Test: blockdev writev readv 8 blocks ...passed 00:32:48.772 Test: blockdev writev readv 30 x 1block ...passed 00:32:48.772 Test: blockdev writev readv block ...passed 00:32:48.772 Test: blockdev writev readv size > 128k ...passed 00:32:48.772 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:48.772 Test: blockdev comparev and writev ...passed 00:32:48.772 Test: blockdev nvme passthru rw ...passed 00:32:48.772 Test: blockdev nvme passthru vendor specific ...passed 00:32:48.772 Test: blockdev nvme admin passthru ...passed 00:32:48.772 Test: blockdev copy ...passed 00:32:48.772 Suite: bdevio tests on: crypto_ram 00:32:48.772 Test: blockdev write read block ...passed 00:32:48.772 Test: blockdev write zeroes read block ...passed 00:32:49.032 Test: blockdev write zeroes read no split ...passed 00:32:49.032 Test: blockdev write zeroes read split ...passed 00:32:49.292 Test: blockdev write zeroes read split partial ...passed 00:32:49.292 Test: blockdev reset ...passed 00:32:49.292 Test: blockdev write read 8 blocks ...passed 00:32:49.292 Test: blockdev write read size > 128k ...passed 00:32:49.292 Test: blockdev write read invalid size ...passed 00:32:49.292 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:49.292 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:49.292 Test: blockdev write read max offset ...passed 00:32:49.292 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:49.292 Test: blockdev writev readv 8 blocks ...passed 00:32:49.292 Test: blockdev writev readv 30 x 1block ...passed 00:32:49.292 Test: blockdev writev readv block ...passed 00:32:49.292 Test: blockdev writev readv size > 128k ...passed 00:32:49.292 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:49.292 Test: blockdev comparev and writev ...passed 00:32:49.292 Test: blockdev nvme passthru rw ...passed 00:32:49.292 Test: blockdev nvme passthru vendor specific ...passed 00:32:49.292 Test: blockdev nvme admin passthru ...passed 00:32:49.292 Test: blockdev copy ...passed 00:32:49.292 00:32:49.292 Run Summary: Type Total Ran Passed Failed Inactive 00:32:49.292 suites 4 4 n/a 0 0 00:32:49.292 tests 92 92 92 0 0 00:32:49.292 asserts 520 520 520 0 n/a 00:32:49.292 00:32:49.292 Elapsed time = 1.839 seconds 00:32:49.292 0 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2729453 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 2729453 ']' 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 2729453 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2729453 00:32:49.292 16:09:09 
blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2729453' 00:32:49.292 killing process with pid 2729453 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 2729453 00:32:49.292 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 2729453 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:49.554 00:32:49.554 real 0m3.934s 00:32:49.554 user 0m10.644s 00:32:49.554 sys 0m0.389s 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:49.554 ************************************ 00:32:49.554 END TEST bdev_bounds 00:32:49.554 ************************************ 00:32:49.554 16:09:09 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:32:49.554 16:09:09 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:32:49.554 16:09:09 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:32:49.554 16:09:09 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:32:49.554 16:09:09 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:49.554 ************************************ 00:32:49.554 START TEST bdev_nbd 00:32:49.554 ************************************ 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2730385 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2730385 /var/tmp/spdk-nbd.sock 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 2730385 ']' 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:32:49.554 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:49.554 16:09:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:49.813 [2024-07-12 16:09:10.040180] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
00:32:49.813 [2024-07-12 16:09:10.040236] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:49.813 [2024-07-12 16:09:10.132232] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:49.813 [2024-07-12 16:09:10.199929] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:49.813 [2024-07-12 16:09:10.220941] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:49.813 [2024-07-12 16:09:10.228963] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:49.813 [2024-07-12 16:09:10.236981] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:50.073 [2024-07-12 16:09:10.325512] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:52.615 [2024-07-12 16:09:12.483186] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:52.615 [2024-07-12 16:09:12.483230] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:52.615 [2024-07-12 16:09:12.483238] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:52.615 [2024-07-12 16:09:12.491203] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:52.615 [2024-07-12 16:09:12.491214] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:52.615 [2024-07-12 16:09:12.491220] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:52.615 [2024-07-12 16:09:12.499223] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:52.615 [2024-07-12 16:09:12.499233] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:52.615 [2024-07-12 16:09:12.499239] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:52.615 [2024-07-12 16:09:12.507242] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:52.615 [2024-07-12 16:09:12.507251] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:52.615 [2024-07-12 16:09:12.507257] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:52.615 16:09:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:52.615 16:09:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram1 crypto_ram2 crypto_ram3' 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:52.616 16:09:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:52.875 1+0 records in 00:32:52.875 1+0 records out 00:32:52.875 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000329261 s, 12.4 MB/s 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:52.875 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 
00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:53.134 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:53.134 1+0 records in 00:32:53.134 1+0 records out 00:32:53.134 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284413 s, 14.4 MB/s 00:32:53.135 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:53.135 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:53.135 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:53.135 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:53.135 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:53.135 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:53.135 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:53.135 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:53.395 16:09:13 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:53.395 1+0 records in 00:32:53.395 1+0 records out 00:32:53.395 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000317774 s, 12.9 MB/s 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:53.395 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:53.395 1+0 records in 00:32:53.395 1+0 records out 00:32:53.395 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289586 s, 14.1 MB/s 00:32:53.655 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:53.655 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:53.655 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:53.656 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:53.656 16:09:13 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:53.656 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:53.656 16:09:13 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:53.656 16:09:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:53.656 { 00:32:53.656 "nbd_device": "/dev/nbd0", 00:32:53.656 "bdev_name": "crypto_ram" 00:32:53.656 }, 00:32:53.656 { 00:32:53.656 "nbd_device": "/dev/nbd1", 00:32:53.656 "bdev_name": "crypto_ram1" 00:32:53.656 }, 00:32:53.656 { 00:32:53.656 "nbd_device": "/dev/nbd2", 00:32:53.656 "bdev_name": "crypto_ram2" 00:32:53.656 }, 00:32:53.656 { 00:32:53.656 "nbd_device": "/dev/nbd3", 00:32:53.656 "bdev_name": "crypto_ram3" 00:32:53.656 } 00:32:53.656 ]' 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:53.656 { 00:32:53.656 "nbd_device": "/dev/nbd0", 00:32:53.656 "bdev_name": "crypto_ram" 00:32:53.656 }, 00:32:53.656 { 00:32:53.656 "nbd_device": "/dev/nbd1", 00:32:53.656 "bdev_name": "crypto_ram1" 00:32:53.656 }, 00:32:53.656 { 00:32:53.656 "nbd_device": "/dev/nbd2", 00:32:53.656 "bdev_name": "crypto_ram2" 00:32:53.656 }, 00:32:53.656 { 00:32:53.656 "nbd_device": "/dev/nbd3", 00:32:53.656 "bdev_name": "crypto_ram3" 00:32:53.656 } 00:32:53.656 ]' 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:53.656 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:53.916 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:54.176 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:54.437 16:09:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:54.698 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:55.268 /dev/nbd0 00:32:55.268 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:55.268 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:55.268 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:32:55.268 16:09:15 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:55.268 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:55.268 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:55.268 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:32:55.268 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:55.268 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:55.268 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:55.269 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:55.269 1+0 records in 00:32:55.269 1+0 records out 00:32:55.269 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00035009 s, 11.7 MB/s 00:32:55.269 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:55.269 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:55.269 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:55.269 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:55.269 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:55.269 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:55.269 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:55.269 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:32:55.529 /dev/nbd1 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:55.529 1+0 records in 00:32:55.529 1+0 records out 00:32:55.529 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278983 s, 14.7 MB/s 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:55.529 16:09:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:32:55.789 /dev/nbd10 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:55.789 1+0 records in 00:32:55.789 1+0 records out 00:32:55.789 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269764 s, 15.2 MB/s 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:55.789 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:32:56.049 /dev/nbd11 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd11 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:56.049 1+0 records in 00:32:56.049 1+0 records out 00:32:56.049 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272057 s, 15.1 MB/s 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:56.049 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:56.311 { 00:32:56.311 "nbd_device": "/dev/nbd0", 00:32:56.311 "bdev_name": "crypto_ram" 00:32:56.311 }, 00:32:56.311 { 00:32:56.311 "nbd_device": "/dev/nbd1", 00:32:56.311 "bdev_name": "crypto_ram1" 00:32:56.311 }, 00:32:56.311 { 00:32:56.311 "nbd_device": "/dev/nbd10", 00:32:56.311 "bdev_name": "crypto_ram2" 00:32:56.311 }, 00:32:56.311 { 00:32:56.311 "nbd_device": "/dev/nbd11", 00:32:56.311 "bdev_name": "crypto_ram3" 00:32:56.311 } 00:32:56.311 ]' 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:56.311 { 00:32:56.311 "nbd_device": "/dev/nbd0", 00:32:56.311 "bdev_name": "crypto_ram" 00:32:56.311 }, 00:32:56.311 { 00:32:56.311 "nbd_device": "/dev/nbd1", 00:32:56.311 "bdev_name": "crypto_ram1" 00:32:56.311 }, 00:32:56.311 { 00:32:56.311 "nbd_device": "/dev/nbd10", 00:32:56.311 "bdev_name": "crypto_ram2" 00:32:56.311 }, 00:32:56.311 { 00:32:56.311 "nbd_device": "/dev/nbd11", 00:32:56.311 "bdev_name": "crypto_ram3" 00:32:56.311 } 00:32:56.311 ]' 00:32:56.311 
16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:56.311 /dev/nbd1 00:32:56.311 /dev/nbd10 00:32:56.311 /dev/nbd11' 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:56.311 /dev/nbd1 00:32:56.311 /dev/nbd10 00:32:56.311 /dev/nbd11' 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:56.311 256+0 records in 00:32:56.311 256+0 records out 00:32:56.311 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.012523 s, 83.7 MB/s 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:56.311 256+0 records in 00:32:56.311 256+0 records out 00:32:56.311 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0483651 s, 21.7 MB/s 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:56.311 256+0 records in 00:32:56.311 256+0 records out 00:32:56.311 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0456161 s, 23.0 MB/s 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:56.311 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:32:56.573 256+0 records in 00:32:56.573 256+0 records out 00:32:56.573 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.029333 s, 35.7 MB/s 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:32:56.573 256+0 records in 00:32:56.573 256+0 records out 00:32:56.573 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0430483 s, 24.4 MB/s 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:56.573 16:09:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:56.833 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:56.833 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:56.833 16:09:17 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:56.833 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:56.833 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:56.833 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:56.833 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:56.833 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:56.833 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:56.833 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:57.404 16:09:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:57.974 16:09:18 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:57.974 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:58.235 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:58.523 malloc_lvol_verify 00:32:58.523 16:09:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:59.115 a2d8e96c-af2f-497b-b954-3ccb0c385ef4 00:32:59.115 16:09:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:59.115 591e5603-7700-434b-9c38-f28e9c3a3a2b 00:32:59.116 16:09:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:59.686 /dev/nbd0 00:32:59.686 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 
00:32:59.686 mke2fs 1.46.5 (30-Dec-2021) 00:32:59.686 Discarding device blocks: 0/4096 done 00:32:59.686 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:59.686 00:32:59.686 Allocating group tables: 0/1 done 00:32:59.686 Writing inode tables: 0/1 done 00:32:59.686 Creating journal (1024 blocks): done 00:32:59.686 Writing superblocks and filesystem accounting information: 0/1 done 00:32:59.686 00:32:59.686 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:59.686 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:59.686 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:59.686 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:59.686 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:59.686 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:59.686 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:59.686 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2730385 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 2730385 ']' 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 2730385 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2730385 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2730385' 00:32:59.946 killing process with pid 2730385 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 2730385 00:32:59.946 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 
2730385 00:33:00.207 16:09:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:33:00.207 00:33:00.207 real 0m10.618s 00:33:00.207 user 0m15.497s 00:33:00.207 sys 0m2.824s 00:33:00.207 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:00.207 16:09:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:33:00.207 ************************************ 00:33:00.207 END TEST bdev_nbd 00:33:00.207 ************************************ 00:33:00.207 16:09:20 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:00.207 16:09:20 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:33:00.207 16:09:20 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:33:00.207 16:09:20 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:33:00.207 16:09:20 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:33:00.207 16:09:20 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:00.207 16:09:20 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:00.207 16:09:20 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:00.469 ************************************ 00:33:00.469 START TEST bdev_fio 00:33:00.469 ************************************ 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:00.469 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:00.469 ************************************ 00:33:00.469 START TEST bdev_fio_rw_verify 00:33:00.469 ************************************ 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:00.469 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:00.470 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:00.470 16:09:20 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:00.728 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:00.728 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:00.728 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:00.728 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:00.728 fio-3.35 00:33:00.728 Starting 4 threads 00:33:15.620 00:33:15.620 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2733035: Fri Jul 12 16:09:33 2024 00:33:15.620 read: IOPS=34.7k, BW=135MiB/s (142MB/s)(1354MiB/10001msec) 00:33:15.620 slat (usec): min=14, max=722, avg=37.82, stdev=25.00 00:33:15.620 clat (usec): min=19, max=1245, avg=226.18, stdev=157.48 00:33:15.620 lat (usec): min=37, max=1348, avg=264.00, stdev=171.11 00:33:15.620 clat percentiles (usec): 00:33:15.620 | 50.000th=[ 174], 99.000th=[ 766], 99.900th=[ 963], 99.990th=[ 1123], 00:33:15.620 | 99.999th=[ 1205] 00:33:15.620 write: IOPS=38.0k, BW=148MiB/s (156MB/s)(1445MiB/9736msec); 0 zone resets 00:33:15.620 slat (usec): min=15, max=365, avg=47.67, stdev=24.99 00:33:15.620 clat (usec): min=16, max=1999, avg=257.55, stdev=161.02 00:33:15.620 lat (usec): min=45, max=2149, avg=305.21, stdev=174.89 00:33:15.620 clat percentiles (usec): 00:33:15.620 | 50.000th=[ 217], 99.000th=[ 791], 99.900th=[ 988], 99.990th=[ 1303], 00:33:15.620 | 99.999th=[ 1844] 00:33:15.620 bw ( KiB/s): min=127120, max=169536, per=97.69%, avg=148524.16, stdev=3187.70, samples=76 00:33:15.620 iops : min=31780, max=42384, avg=37131.00, stdev=796.93, samples=76 00:33:15.620 lat (usec) : 20=0.01%, 50=0.04%, 100=14.52%, 250=48.68%, 500=28.60% 00:33:15.620 lat (usec) : 750=6.83%, 1000=1.25% 00:33:15.620 lat (msec) : 2=0.08% 00:33:15.620 cpu : usr=99.72%, sys=0.00%, ctx=72, majf=0, minf=271 00:33:15.620 IO depths : 1=0.2%, 2=28.6%, 4=57.0%, 8=14.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:15.620 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:15.620 complete : 0=0.0%, 4=87.5%, 8=12.5%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:15.620 issued rwts: total=346638,370045,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:15.620 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:15.620 00:33:15.620 Run status group 0 (all jobs): 00:33:15.620 READ: bw=135MiB/s (142MB/s), 135MiB/s-135MiB/s (142MB/s-142MB/s), io=1354MiB (1420MB), run=10001-10001msec 00:33:15.620 WRITE: bw=148MiB/s (156MB/s), 148MiB/s-148MiB/s (156MB/s-156MB/s), io=1445MiB (1516MB), run=9736-9736msec 00:33:15.620 00:33:15.620 real 0m13.311s 00:33:15.620 user 0m52.187s 00:33:15.620 sys 0m0.378s 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:15.620 ************************************ 00:33:15.620 END TEST bdev_fio_rw_verify 00:33:15.620 ************************************ 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:15.620 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "15907dd0-9583-53cd-97f0-26e9320557e0"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "15907dd0-9583-53cd-97f0-26e9320557e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' 
"name": "crypto_ram1",' ' "aliases": [' ' "12e3cbea-fddb-5979-960a-6665bf18e658"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "12e3cbea-fddb-5979-960a-6665bf18e658",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3b8603a6-9440-5b94-b136-3f19cae2ef4e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3b8603a6-9440-5b94-b136-3f19cae2ef4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "dc8b5ea0-20b8-5e10-8516-be9207feffbd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "dc8b5ea0-20b8-5e10-8516-be9207feffbd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:33:15.621 crypto_ram1 00:33:15.621 crypto_ram2 00:33:15.621 crypto_ram3 ]] 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "15907dd0-9583-53cd-97f0-26e9320557e0"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "15907dd0-9583-53cd-97f0-26e9320557e0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "12e3cbea-fddb-5979-960a-6665bf18e658"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "12e3cbea-fddb-5979-960a-6665bf18e658",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3b8603a6-9440-5b94-b136-3f19cae2ef4e"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3b8603a6-9440-5b94-b136-3f19cae2ef4e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' 
' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "dc8b5ea0-20b8-5e10-8516-be9207feffbd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "dc8b5ea0-20b8-5e10-8516-be9207feffbd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:15.621 
16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:15.621 ************************************ 00:33:15.621 START TEST bdev_fio_trim 00:33:15.621 ************************************ 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:15.621 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:15.622 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:15.622 16:09:34 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:15.622 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:15.622 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:15.622 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:15.622 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:15.622 fio-3.35 00:33:15.622 Starting 4 threads 00:33:27.850 00:33:27.850 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2735357: Fri Jul 12 16:09:47 2024 00:33:27.850 write: IOPS=66.1k, BW=258MiB/s (271MB/s)(2583MiB/10001msec); 0 zone resets 00:33:27.850 slat (usec): min=14, max=746, avg=37.39, stdev=24.17 00:33:27.850 clat (usec): min=22, max=916, avg=127.52, stdev=75.39 00:33:27.850 lat (usec): min=40, max=1105, avg=164.91, stdev=88.74 00:33:27.850 clat percentiles (usec): 00:33:27.850 | 50.000th=[ 112], 99.000th=[ 375], 99.900th=[ 490], 99.990th=[ 594], 00:33:27.850 | 99.999th=[ 865] 00:33:27.850 bw ( KiB/s): min=228096, max=284480, per=100.00%, avg=265913.21, stdev=5920.36, samples=76 00:33:27.850 iops : min=57024, max=71120, avg=66478.26, stdev=1480.10, samples=76 00:33:27.850 trim: IOPS=66.1k, BW=258MiB/s (271MB/s)(2583MiB/10001msec); 0 zone resets 00:33:27.850 slat (nsec): min=4986, max=50020, avg=7719.33, stdev=3267.27 00:33:27.850 clat (usec): min=40, max=1105, avg=165.09, stdev=88.75 00:33:27.850 lat (usec): min=46, max=1124, avg=172.80, stdev=89.36 00:33:27.850 clat percentiles (usec): 00:33:27.850 | 50.000th=[ 143], 99.000th=[ 453], 99.900th=[ 578], 99.990th=[ 709], 00:33:27.850 | 99.999th=[ 1037] 00:33:27.850 bw ( KiB/s): min=228096, max=284480, per=100.00%, avg=265913.21, stdev=5920.36, samples=76 00:33:27.850 iops : min=57024, max=71120, avg=66478.26, stdev=1480.10, samples=76 00:33:27.850 lat (usec) : 50=5.11%, 100=28.05%, 250=54.92%, 500=11.66%, 750=0.26% 00:33:27.850 lat (usec) : 1000=0.01% 
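Both bdev_fio_rw_verify above and bdev_fio_trim here drive fio through SPDK's fio bdev plugin: the plugin built at build/fio/spdk_bdev is LD_PRELOADed and fio is pointed at an SPDK JSON config instead of at raw block devices. A condensed sketch of the invocation seen in the trace, with SPDK_DIR standing in for the workspace path:

# fio against SPDK bdevs via the external spdk_bdev ioengine.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk   # path used by this job
LD_PRELOAD=$SPDK_DIR/build/fio/spdk_bdev /usr/src/fio/fio \
    --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 \
    --spdk_json_conf=$SPDK_DIR/test/bdev/bdev.json \
    --verify_state_save=0 --aux-path=$SPDK_DIR/../output \
    $SPDK_DIR/test/bdev/bdev.fio

The rw-verify run in the trace additionally passes --spdk_mem=0; the trim run omits it.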
00:33:27.850 lat (msec) : 2=0.01% 00:33:27.850 cpu : usr=99.72%, sys=0.00%, ctx=47, majf=0, minf=89 00:33:27.850 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:27.850 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:27.850 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:27.850 issued rwts: total=0,661360,661360,0 short=0,0,0,0 dropped=0,0,0,0 00:33:27.850 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:27.850 00:33:27.850 Run status group 0 (all jobs): 00:33:27.850 WRITE: bw=258MiB/s (271MB/s), 258MiB/s-258MiB/s (271MB/s-271MB/s), io=2583MiB (2709MB), run=10001-10001msec 00:33:27.850 TRIM: bw=258MiB/s (271MB/s), 258MiB/s-258MiB/s (271MB/s-271MB/s), io=2583MiB (2709MB), run=10001-10001msec 00:33:27.850 00:33:27.850 real 0m13.400s 00:33:27.850 user 0m52.147s 00:33:27.850 sys 0m0.353s 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:27.850 ************************************ 00:33:27.850 END TEST bdev_fio_trim 00:33:27.850 ************************************ 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:33:27.850 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:33:27.850 00:33:27.850 real 0m27.052s 00:33:27.850 user 1m44.516s 00:33:27.850 sys 0m0.906s 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:27.850 ************************************ 00:33:27.850 END TEST bdev_fio 00:33:27.850 ************************************ 00:33:27.850 16:09:47 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:27.850 16:09:47 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:33:27.850 16:09:47 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:27.850 16:09:47 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:27.850 16:09:47 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:27.850 16:09:47 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:27.850 ************************************ 00:33:27.850 START TEST bdev_verify 00:33:27.850 ************************************ 00:33:27.850 16:09:47 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:27.850 [2024-07-12 16:09:47.857009] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 
24.03.0 initialization... 00:33:27.850 [2024-07-12 16:09:47.857061] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2737174 ] 00:33:27.850 [2024-07-12 16:09:47.948951] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:27.850 [2024-07-12 16:09:48.044377] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:27.850 [2024-07-12 16:09:48.044382] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:27.850 [2024-07-12 16:09:48.065590] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:27.850 [2024-07-12 16:09:48.073620] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:27.850 [2024-07-12 16:09:48.081644] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:27.850 [2024-07-12 16:09:48.182852] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:30.395 [2024-07-12 16:09:50.440701] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:30.395 [2024-07-12 16:09:50.440798] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:30.395 [2024-07-12 16:09:50.440808] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:30.395 [2024-07-12 16:09:50.448722] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:30.395 [2024-07-12 16:09:50.448735] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:30.395 [2024-07-12 16:09:50.448742] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:30.395 [2024-07-12 16:09:50.456743] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:30.395 [2024-07-12 16:09:50.456755] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:30.395 [2024-07-12 16:09:50.456760] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:30.395 [2024-07-12 16:09:50.464756] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:30.395 [2024-07-12 16:09:50.464768] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:30.395 [2024-07-12 16:09:50.464773] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:30.395 Running I/O for 5 seconds... 
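The "Found key" notices above show each crypto vbdev being bound to a Malloc base bdev and a named DEK; the same binding appears in the bdev_get_bdevs dump printed during the fio stage. A small jq query that pulls just that mapping out of the dump, assuming a running target reachable over the default RPC socket (socket path omitted here):

# Print the crypto-bdev -> base-bdev/key binding for every crypto vbdev, e.g.
# {"base_bdev_name":"Malloc0","name":"crypto_ram","key_name":"test_dek_qat_cbc"}
scripts/rpc.py bdev_get_bdevs \
    | jq -c '.[] | select(.product_name == "crypto") | .driver_specific.crypto'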
00:33:35.679 00:33:35.679 Latency(us) 00:33:35.679 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:35.679 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:35.679 Verification LBA range: start 0x0 length 0x1000 00:33:35.679 crypto_ram : 5.06 607.32 2.37 0.00 0.00 210396.96 5368.91 121796.14 00:33:35.679 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:35.679 Verification LBA range: start 0x1000 length 0x1000 00:33:35.679 crypto_ram : 5.07 504.96 1.97 0.00 0.00 253003.04 4587.52 148413.83 00:33:35.679 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:35.679 Verification LBA range: start 0x0 length 0x1000 00:33:35.679 crypto_ram1 : 5.06 606.86 2.37 0.00 0.00 209918.70 6200.71 120182.94 00:33:35.679 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:35.679 Verification LBA range: start 0x1000 length 0x1000 00:33:35.680 crypto_ram1 : 5.07 504.84 1.97 0.00 0.00 252323.78 4940.41 141154.46 00:33:35.680 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:35.680 Verification LBA range: start 0x0 length 0x1000 00:33:35.680 crypto_ram2 : 5.05 4753.57 18.57 0.00 0.00 26702.57 2886.10 24298.73 00:33:35.680 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:35.680 Verification LBA range: start 0x1000 length 0x1000 00:33:35.680 crypto_ram2 : 5.06 3923.66 15.33 0.00 0.00 32344.15 5620.97 27222.65 00:33:35.680 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:35.680 Verification LBA range: start 0x0 length 0x1000 00:33:35.680 crypto_ram3 : 5.05 4761.33 18.60 0.00 0.00 26633.65 2797.88 24399.56 00:33:35.680 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:35.680 Verification LBA range: start 0x1000 length 0x1000 00:33:35.680 crypto_ram3 : 5.06 3922.37 15.32 0.00 0.00 32289.65 4688.34 27424.30 00:33:35.680 =================================================================================================================== 00:33:35.680 Total : 19584.91 76.50 0.00 0.00 52005.44 2797.88 148413.83 00:33:35.680 00:33:35.680 real 0m8.090s 00:33:35.680 user 0m15.441s 00:33:35.680 sys 0m0.352s 00:33:35.680 16:09:55 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:35.680 16:09:55 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:35.680 ************************************ 00:33:35.680 END TEST bdev_verify 00:33:35.680 ************************************ 00:33:35.680 16:09:55 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:35.680 16:09:55 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:35.680 16:09:55 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:33:35.680 16:09:55 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:35.680 16:09:55 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:35.680 ************************************ 00:33:35.680 START TEST bdev_verify_big_io 00:33:35.680 ************************************ 00:33:35.680 16:09:55 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:35.680 [2024-07-12 16:09:56.026279] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:33:35.680 [2024-07-12 16:09:56.026328] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2738428 ] 00:33:35.680 [2024-07-12 16:09:56.115750] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:35.939 [2024-07-12 16:09:56.192346] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:35.939 [2024-07-12 16:09:56.192351] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:35.939 [2024-07-12 16:09:56.213430] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:35.939 [2024-07-12 16:09:56.221459] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:35.939 [2024-07-12 16:09:56.229482] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:35.939 [2024-07-12 16:09:56.315626] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:38.597 [2024-07-12 16:09:58.468508] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:38.597 [2024-07-12 16:09:58.468563] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:38.597 [2024-07-12 16:09:58.468571] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:38.597 [2024-07-12 16:09:58.476525] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:38.597 [2024-07-12 16:09:58.476537] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:38.597 [2024-07-12 16:09:58.476542] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:38.597 [2024-07-12 16:09:58.484545] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:38.597 [2024-07-12 16:09:58.484556] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:38.597 [2024-07-12 16:09:58.484561] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:38.597 [2024-07-12 16:09:58.492565] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:38.597 [2024-07-12 16:09:58.492576] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:38.597 [2024-07-12 16:09:58.492581] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:38.597 Running I/O for 5 seconds... 00:33:39.175 [2024-07-12 16:09:59.326307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.175 [2024-07-12 16:09:59.326704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
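Both verify stages drive I/O with the bdevperf example application rather than fio; the only difference between bdev_verify and bdev_verify_big_io is the I/O size. The two invocations, condensed from the run_test lines above, with SPDK_DIR again standing in for the workspace path:

# bdevperf verify workloads over the bdevs described in bdev.json;
# flags are the ones passed by run_test above, -o is the I/O size in bytes.
SPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk
$SPDK_DIR/build/examples/bdevperf --json $SPDK_DIR/test/bdev/bdev.json \
    -q 128 -o 4096 -w verify -t 5 -C -m 0x3      # bdev_verify
$SPDK_DIR/build/examples/bdevperf --json $SPDK_DIR/test/bdev/bdev.json \
    -q 128 -o 65536 -w verify -t 5 -C -m 0x3     # bdev_verify_big_io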
00:33:39.175 [2024-07-12 16:09:59.326768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:39.175 ... [the identical *ERROR* line from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeats several hundred times in bursts between 16:09:59.326768 and 16:09:59.606660 while the verify jobs run] ...
00:33:39.180 [2024-07-12 16:09:59.606660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:39.180 [2024-07-12 16:09:59.608753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.180 [2024-07-12 16:09:59.609130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.180 [2024-07-12 16:09:59.609505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.180 [2024-07-12 16:09:59.610081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.180 [2024-07-12 16:09:59.611998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.180 [2024-07-12 16:09:59.613598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.180 [2024-07-12 16:09:59.615059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.180 [2024-07-12 16:09:59.615902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.180 [2024-07-12 16:09:59.616265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.180 [2024-07-12 16:09:59.616275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.181 [2024-07-12 16:09:59.618652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.181 [2024-07-12 16:09:59.619156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.620618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.622211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.623944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.624793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.626127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.627720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.627994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.628007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.631400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.632679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.634144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.635616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.444 [2024-07-12 16:09:59.637238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.638512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.639977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.641440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.641763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.641773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.645294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.646764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.648229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.649449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.651037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.652504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.653973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.655281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.655755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.655766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.659223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.660691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.661652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.663237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.664976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.666452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.667222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.667596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.444 [2024-07-12 16:09:59.667944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.667954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.670692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.672285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.673622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.675093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.676327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.676720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.677094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.444 [2024-07-12 16:09:59.677471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.677767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.677777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.680965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.682559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.684156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.684533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.685326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.685701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.687148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.688432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.688705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.688718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.691985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.692363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.445 [2024-07-12 16:09:59.692738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.693114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.695104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.696454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.697854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.698745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.699031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.699041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.701375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.701758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.702132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.702520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.703247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.703625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.704005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.704388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.704766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.704777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.707359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.707738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.708124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.708499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.709332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.709708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.445 [2024-07-12 16:09:59.710086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.710469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.710980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.710991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.713481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.713860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.714235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.714613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.715337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.715716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.716091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.716464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.716943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.716953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.719534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.719918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.720294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.720668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.721447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.721825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.722199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.722573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.723081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.723091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.445 [2024-07-12 16:09:59.725672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.726063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.726440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.726817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.727583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.727962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.728337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.728717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.729091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.729102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.731567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.731947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.732329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.732704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.733557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.733943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.734320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.734699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.735182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.735193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.737816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.738196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.738577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.738956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.445 [2024-07-12 16:09:59.739722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.740098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.740472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.740859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.741240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.741251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.743785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.744163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.744539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.744917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.745804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.746181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.445 [2024-07-12 16:09:59.746555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.746947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.747339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.747350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.749815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.750194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.750570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.750599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.751341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.751722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.752095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.752470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.446 [2024-07-12 16:09:59.752899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.752909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.755439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.755819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.756198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.756573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.756610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.757028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.757409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.757788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.758164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.758541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.759001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.759012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.761242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.761281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.761317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.761352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.761735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.761776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.761812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.761848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.761887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.762257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.446 [2024-07-12 16:09:59.762267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.764456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.764495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.764531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.764569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.765124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.765175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.765218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.765255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.765290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.765740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.765754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.767802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.767841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.767876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.767911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.768357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.768399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.768435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.768470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.768506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.768842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.768853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.770737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.446 [2024-07-12 16:09:59.770775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.770810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.770846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.771337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.771377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.771413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.771449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.771485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.771845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.771856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.773616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.773654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.773690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.773728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.774146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.774199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.774235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.774277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.774313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.774659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.774669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.776582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.776620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.776669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.446 [2024-07-12 16:09:59.776705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.777183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.777226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.777262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.777297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.777333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.777654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.777664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.779329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.779367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.779405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.779441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.779714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.779756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.779792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.779828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.446 [2024-07-12 16:09:59.779864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.780131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.780141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.782396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.782434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.782473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.782508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.782868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.447 [2024-07-12 16:09:59.782912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.782948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.782984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.783019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.783287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.783297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.784921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.784958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.784994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.785030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.785386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.785429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.785467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.785503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.785539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.785957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.785968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.787840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.787880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.787916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.787952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.788219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.788262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.788299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.447 [2024-07-12 16:09:59.788337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.788374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.788756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.788766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.790549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.790589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.790625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.790664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.791011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.791051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.791088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.791123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.791158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.791540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.791551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.793167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.793205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.793244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.793280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.793578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.793619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.793654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.793690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.447 [2024-07-12 16:09:59.793730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.447 [2024-07-12 16:09:59.793997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.447-00:33:39.714 [... the same "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 (accel_dpdk_cryptodev_task_alloc_resources) repeats several hundred times, last occurrence at 2024-07-12 16:10:00.099637 ...]
00:33:39.714 [2024-07-12 16:10:00.100911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.102371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.102645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.102656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.106428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.107799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.109258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.110727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.111134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.112660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.113941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.115400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.116870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.117205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.117215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.120783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.122253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.123704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.125127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.125474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.126760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.128224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.129692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.131122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.714 [2024-07-12 16:10:00.131552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.131563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.134950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.136419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.137955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.139061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.139368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.140840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.142304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.143789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.144167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.144540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.144551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.714 [2024-07-12 16:10:00.147902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.149139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.150534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.151810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.152086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.153562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.154622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.155000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.155375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.155812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.715 [2024-07-12 16:10:00.155824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.977 [2024-07-12 16:10:00.158786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.160064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.160103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.161573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.161852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.162747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.163122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.163497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.163875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.164172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.164183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.167385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.168930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.170392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.170431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.170810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.171190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.171565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.171945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.173555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.173878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.173890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.175460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.175499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.977 [2024-07-12 16:10:00.175535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.175571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.175845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.175886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.175922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.175958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.175995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.176469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.176479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.178492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.178532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.178568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.178604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.178876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.178918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.178954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.178990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.179026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.179297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.179307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.180984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.181023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.181059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.181095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.977 [2024-07-12 16:10:00.181557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.181597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.181633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.181669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.181707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.182064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.182074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.183773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.183812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.183848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.183884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.184342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.184383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.184419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.184454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.184490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.184848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.184858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.186768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.186807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.186843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.186885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.187323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.187368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.977 [2024-07-12 16:10:00.187407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.187443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.187480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.187830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.187844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.189500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.189539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.189575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.189610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.189883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.189924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.189960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.189996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.977 [2024-07-12 16:10:00.190032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.190300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.190309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.192437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.192476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.192512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.192547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.192875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.192916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.192952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.192988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.978 [2024-07-12 16:10:00.193024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.193291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.193301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.194922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.194968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.195007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.195043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.195363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.195405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.195441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.195488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.195526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.196045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.196056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.197922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.197965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.198001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.198037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.198305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.198348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.198384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.198420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.198456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.198873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.978 [2024-07-12 16:10:00.198883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.200523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.200562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.200598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.200633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.201033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.201074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.201118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.201154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.201190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.201695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.201706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.203319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.203357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.203397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.203434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.203704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.203749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.203794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.203831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.203866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.204133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.204143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.206090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.978 [2024-07-12 16:10:00.206129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.206169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.206204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.206599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.206652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.206688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.206729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.206765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.207112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.207121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.208827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.208866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.208901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.208937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.209206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.209250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.209286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.209322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.209358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.209688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.209702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.978 [2024-07-12 16:10:00.212157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.212860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.214525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.214564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.214602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.214638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.215131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.215172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.215208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.215244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.215281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.978 [2024-07-12 16:10:00.215650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.215660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.217625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.217664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.217703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.217742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.218014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.979 [2024-07-12 16:10:00.218054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.218091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.218126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.218166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.218436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.218447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.220159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.220198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.220236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.220271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.220663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.220707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.220753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.220790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.220827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.221204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.221214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.222830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.222870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.222905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.222941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.223237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.223277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.223313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.979 [2024-07-12 16:10:00.223349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.223389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.223657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.223666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.225641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.225680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.225719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.225755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.226055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.226099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.226136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.226172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.226208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.226475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.226484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.228091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.228139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.228178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.228213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.228482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.228523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.228559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.228596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.228632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.979 [2024-07-12 16:10:00.229144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.229155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.231968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.233586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.233626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.233662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.233701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.234130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.234172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.234208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.234244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.234280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.234726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.234736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.979 [2024-07-12 16:10:00.236410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.236449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.236484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.236520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.236877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.236919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.236956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.236992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.237028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.237355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.237364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.239287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.239327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.239366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.239401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.239819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.239860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.239897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.239933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.979 [2024-07-12 16:10:00.239968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.240238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.240249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.241914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.241953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.980 [2024-07-12 16:10:00.241988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.242024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.242290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.242334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.242370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.242407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.242442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.242713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.242724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.244922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.244961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.244996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.245032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.245360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.245400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.245436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.245471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.245507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.245778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.245788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.247366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.247405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.247440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.247476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.980 [2024-07-12 16:10:00.247889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.247942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.247978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.248013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.248049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.248469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.248480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.250365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.250404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.250443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.250478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.250751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.250792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.250828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.250865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.250901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.251221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.251231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.252972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.253012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.253047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.253085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.253438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.253482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.980 [2024-07-12 16:10:00.253519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.253554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.253590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.253984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.253995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.255620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.255659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.255695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.255734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.256036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.256078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.256114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.256154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.256190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.256459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.256469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.258432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.258471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.259318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.259356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.259685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.259729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.259766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.259802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.980 [2024-07-12 16:10:00.259837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.260105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.260115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.261740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.261780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.261816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.262851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.263354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.263396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.263434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.263470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.263506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.263873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.263883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.267250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.268312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.269591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.271057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.271337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.272762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.273139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.273513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.273888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.980 [2024-07-12 16:10:00.274231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.981 [2024-07-12 16:10:00.274241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.277117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.278570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.280040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.281398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.281810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.282190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.282565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.283100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.284576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.284857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.284868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.287956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.289173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.289548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.289924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.290404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.291175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.292451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.293951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.295547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.295826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.295837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.297837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.981 [2024-07-12 16:10:00.298215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.298594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.299605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.299913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.301308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.302843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.303926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.305202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.305477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.305487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.308936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.310212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.311696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.313162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.313496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.314566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.315847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.317328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.318796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.319071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.319081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.321701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.322083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.322457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.981 [2024-07-12 16:10:00.322835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.323242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.323621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.324001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.324380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.324760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.325124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.325138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.327681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.328066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.328452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.328830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.329234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.329613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.329994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.330369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.330746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.331208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.331218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.333683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.334068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.334444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.334822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.335199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.981 [2024-07-12 16:10:00.335578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.335957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.336331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.336712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.337092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.337101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.339684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.340067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.340442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.340822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.981 [2024-07-12 16:10:00.341284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.341663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.342043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.342417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.342800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.343314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.343325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.345786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.346165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.346538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.346919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.347349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.347734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.348110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.982 [2024-07-12 16:10:00.348486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.348863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.349218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.349228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.351882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.352260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.352634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.353012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.353387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.353770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.354146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.354520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.354898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.355278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.355288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.357776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.358157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.358532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.358911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.359365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.359753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.360130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.360503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.360886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.982 [2024-07-12 16:10:00.361365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.361376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.363877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.364256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.364630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.365008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.365399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.365792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.366167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.366541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.366919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.367289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.367302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.370116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.370507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.370885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.371261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.371751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.372130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.372504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.372881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.373263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.373758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.373770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.982 [2024-07-12 16:10:00.376208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.376588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.376977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.377356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.377746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.378124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.378498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.378874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.379250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.379767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.379777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.382413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.382797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.383175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.383551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.383924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.384305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.384681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.385060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.385438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.385827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.385839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.388100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.388481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.982 [2024-07-12 16:10:00.388862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.389253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.389719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.390099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.390474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.391627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.392910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.393180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.393190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.396262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.397048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.397424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.397801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.982 [2024-07-12 16:10:00.398179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.399336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.400611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.402075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.403539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.403815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.403826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.405811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.406189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.406564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.407923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:39.983 [2024-07-12 16:10:00.408222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.409697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.411164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.412317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.413782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.414090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.414099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.416318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.417006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.418314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.419901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:39.983 [2024-07-12 16:10:00.420176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.421790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.422633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.423909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.425448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.425727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.425738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.428844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.430124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.431587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.433080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.433414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.434669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.246 [2024-07-12 16:10:00.435945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.437423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.438885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.439196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.439206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.442838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.246 [2024-07-12 16:10:00.444299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.445892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.447487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.447933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.449368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.450956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.452504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.453961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.454337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.454347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.458019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.459499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.460961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.461809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.462128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.463731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.465319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.466790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.247 [2024-07-12 16:10:00.467305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.467760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.467774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.471184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.472738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.473678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.474967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.475241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.476715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.478248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.478623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.479003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.479488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.479500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.482046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.483327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.484788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.486263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.486535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.486916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.487292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.487677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.488437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.488741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.247 [2024-07-12 16:10:00.488752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.491879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.493344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.494671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.495049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.495459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.495841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.496411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.497832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.499424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.499696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.499706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.502511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.502892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.503268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.503642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.503969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.505250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.506771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.508359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.509947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.510443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.510452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.247 [2024-07-12 16:10:00.512500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.247 [2024-07-12 16:10:00.512884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev_task_alloc_resources repeat continuously between 16:10:00.512884 and 16:10:00.737010 ...]
00:33:40.516 [2024-07-12 16:10:00.737010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:40.516 [2024-07-12 16:10:00.737020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.739469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.740372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.741476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.741853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.742195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.743524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.744404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.745480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.746936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.747380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.747390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.749746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.750141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.750516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.750892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.751250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.751627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.752006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.752381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.752762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.753081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.753090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.756186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.516 [2024-07-12 16:10:00.757655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.759116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.760014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.760479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.760861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.761237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.762160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.763435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.763705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.763720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.766837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.767605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.767982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.768357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.768747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.769844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.771137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.772601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.774069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.774397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.774408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.776385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.776766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.777141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.516 [2024-07-12 16:10:00.778557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.778862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.780337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.781802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.783033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.784427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.784786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.516 [2024-07-12 16:10:00.784796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.787044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.787823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.789108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.790647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.790923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.792515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.793406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.794677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.796176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.796445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.796455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.799580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.800862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.802326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.803786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.804055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.517 [2024-07-12 16:10:00.805178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.806454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.807906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.809369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.809669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.809680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.813198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.814664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.816156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.817744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.818219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.819520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.821071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.822661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.824184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.824561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.824571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.828155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.829624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.831090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.831950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.832319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.833924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.835461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.517 [2024-07-12 16:10:00.836794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.837517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.838030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.838040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.841468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.842953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.843840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.845122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.845393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.846887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.848478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.848856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.849231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.849661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.849672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.852206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.853514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.855103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.856716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.856986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.857497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.857877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.858251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.858632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.517 [2024-07-12 16:10:00.858906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.858917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.862112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.863577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.865132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.865507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.865878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.866257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.866631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.868216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.869572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.869845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.869863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.873048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.873428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.873806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.874180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.874561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.876156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.877450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.878907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.880364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.880777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.880787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.517 [2024-07-12 16:10:00.882775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.883153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.883528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.885116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.885446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.886921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.888392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.889390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.890983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.517 [2024-07-12 16:10:00.891302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.891312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.893605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.894452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.895724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.897255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.897526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.899119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.900015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.901290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.902753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.903026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.903035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.906287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.907568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.518 [2024-07-12 16:10:00.909023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.910489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.910805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.912022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.913298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.914756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.916222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.916505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.916515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.920024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.921495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.922953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.924360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.924696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.925981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.927446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.928905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.930494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.930892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.930902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.934309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.935772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.937348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.938367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.518 [2024-07-12 16:10:00.938698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.940162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.941620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.943211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.943585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.943965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.943975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.947408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.948510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.950045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.951321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.951595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.953071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.954001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.954380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.954759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.955157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.955169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.958335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.518 [2024-07-12 16:10:00.959735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.961187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.962654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.963114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.963495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.781 [2024-07-12 16:10:00.963874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.964248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.965175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.965527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.965537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.968664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.970131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.971074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.971459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.971821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.972201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.973240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.974513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.975976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.976246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.976256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.978579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.978959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.978997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.979368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.979758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.981192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.982475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.983934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.781 [2024-07-12 16:10:00.985395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.985746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.985757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.987784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.988164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.988545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.988583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.988940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.990531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.992022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.993484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.994344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.994669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.994679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.996644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.996683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.996722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.996765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.997264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.997305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.997341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.997377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.997415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.997749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.781 [2024-07-12 16:10:00.997759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.999447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.999485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.999521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.999557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.999829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.999869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.999905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.999945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:00.999981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.000248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.000258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.002674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.002715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.002751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.002787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.003155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.003195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.003231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.003267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.003303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.003569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.781 [2024-07-12 16:10:01.003579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.005211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.782 [2024-07-12 16:10:01.005249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.005285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.005320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.005740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.005793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.005829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.005865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.005901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.006323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.006333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.008252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.008291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.008327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.008362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.008628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.008675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.008715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.008753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.008788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.009117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.009126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.010940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.010978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:40.782 [2024-07-12 16:10:01.011014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:40.782 [2024-07-12 16:10:01.011049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... the same src_mbufs allocation error repeats for each crypto task through 2024-07-12 16:10:01.089940 ...]
00:33:40.784 [2024-07-12 16:10:01.180782] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
[... the same dst_mbufs allocation error repeats for each crypto task through 2024-07-12 16:10:01.429164 ...]
00:33:41.049 [2024-07-12 16:10:01.429200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.429426] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.429440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.430001] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.430037] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.431409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.431449] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.431676] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.431689] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.431701] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.431717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.433549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.433589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.433898] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.433934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.434238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.434253] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.434851] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.434887] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.436190] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.436226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.436452] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.436465] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.436478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.049 [2024-07-12 16:10:01.436490] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.439585] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.049 [2024-07-12 16:10:01.439622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.441236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.441273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.441679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.441693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.442005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.442042] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.442346] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.442390] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.442872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.442889] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.442908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.442922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.444813] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.444851] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.445978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.446014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.446238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.446252] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.446608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.446655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.446963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.050 [2024-07-12 16:10:01.447002] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.447310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.447327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.447342] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.447355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.450380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.450419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.450965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.451001] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.451226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.451248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.452824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.452876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.453380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.453416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.453643] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.453656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.453669] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.453692] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.455681] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.455725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.456295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.456330] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.456562] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.050 [2024-07-12 16:10:01.456575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.458176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.458221] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.458944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.458980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.459234] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.459248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.459261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.459274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.461504] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.461542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.461851] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.461888] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.462176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.462189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.463511] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.463547] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.464936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.464973] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.465236] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.465251] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.465263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.465275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.467178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.050 [2024-07-12 16:10:01.467218] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.467522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.467560] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.467881] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.467895] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.468553] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.468588] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.469613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.469648] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.469878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.469892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.469904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.469916] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.471655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.471692] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.471999] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.472037] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.472316] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.472330] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.472635] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.472671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.472978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.473014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.473241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.050 [2024-07-12 16:10:01.473254] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.473266] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.473278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.476129] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.476173] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.050 [2024-07-12 16:10:01.476475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.476511] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.476874] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.476892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.477197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.477247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.477549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.477584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.477914] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.477930] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.477944] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.477958] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.480661] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.480699] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.482133] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.482169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.482472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.482485] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.482795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.051 [2024-07-12 16:10:01.482832] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.483135] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.483181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.483675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.483691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.483706] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.483725] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.486429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.486467] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.487655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.487700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.488075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.488089] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.488394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.488432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.488742] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.488777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.489196] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.489214] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.489228] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.489242] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.491174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.491213] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.051 [2024-07-12 16:10:01.491249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.051 [2024-07-12 16:10:01.491284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.491586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.491604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.491915] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.491953] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.491989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.492025] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.492335] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.492350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.492362] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.492376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494075] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494185] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494477] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494501] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494538] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494573] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494608] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494642] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.494986] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.495000] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.314 [2024-07-12 16:10:01.495015] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.496683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.496726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.496765] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.496802] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.497110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.497126] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.497165] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.497203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.497238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.497273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.314 [2024-07-12 16:10:01.497730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.497744] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.497757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.497769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.499569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.499614] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.499649] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.499683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.500110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.500124] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.500161] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.500197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.500233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.315 [2024-07-12 16:10:01.500269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.500567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.500581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.500593] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.500618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.502510] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.502550] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.502588] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.502622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503031] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503142] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.503536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.505184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.505244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.505280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.505314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.505646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.315 [2024-07-12 16:10:01.505660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.505696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.505752] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.505797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.505831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.506265] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.506281] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.506295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.506309] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508094] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508132] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508172] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508515] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.508675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.509012] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.509026] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.509039] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.509052] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.315 [2024-07-12 16:10:01.511068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511102] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511138] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511439] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511490] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511525] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511566] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.511983] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.512011] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.512023] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.512035] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.513904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.513943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.513980] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514360] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514470] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514504] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.315 [2024-07-12 16:10:01.514921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514934] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.514946] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.516770] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.516808] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.516846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.516881] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.517191] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.517205] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.517241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.517278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.517314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.517349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.315 [2024-07-12 16:10:01.517775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.517789] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.517801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.517814] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.519555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.519601] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.519636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.519670] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.520091] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.520106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.316 [2024-07-12 16:10:01.520143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.316 [2024-07-12 16:10:01.520182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:41.317 [2024-07-12 16:10:01.548838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:41.582 [2024-07-12 16:10:01.961700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
(the two *ERROR* messages above repeat continuously in the console output between 16:10:01.520 and 16:10:01.961; only the first and last occurrences are kept as a placeholder)
00:33:41.582 [2024-07-12 16:10:01.961754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.961801] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.961844] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.962156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.962199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.962252] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.962295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.962748] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.962769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.962785] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.965067] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.965113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.965155] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.965197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.965622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.965671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.965724] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.965768] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.966162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.966178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.966193] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.968159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.968204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.968247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.582 [2024-07-12 16:10:01.968293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.968726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.968775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.968828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.968870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.969244] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.969261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.969276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.971645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.971691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.971737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.971780] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.582 [2024-07-12 16:10:01.972249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.972293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.972336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.972378] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.972684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.972700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.972719] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.974693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.974746] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.974789] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.974835] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.975267] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.583 [2024-07-12 16:10:01.975312] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.975354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.975396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.975742] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.975759] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.975774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.977466] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.977513] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.977559] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.977604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.978033] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.978078] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.978121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.978163] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.978556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.978575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.978592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.980224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.980269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.980327] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.980373] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.980697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.980745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.980788] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.583 [2024-07-12 16:10:01.980841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.981117] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.981136] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.981164] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.983379] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.983435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.983478] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.983520] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.983866] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.983911] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.983954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.983996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.984288] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.984305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.984320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.986307] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.986355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.986399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.986443] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.986757] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.986803] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.986846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.986890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.987163] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.583 [2024-07-12 16:10:01.987180] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.987195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.988870] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.988922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.988965] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.989008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.989381] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.989425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.989467] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.989521] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.989997] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.990020] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.990037] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.991874] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.991918] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.991961] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.992006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.992477] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.992522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.992564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.992606] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.992883] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.992900] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.992915] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.583 [2024-07-12 16:10:01.995084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.995160] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.995203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.995246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.995556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.995599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.995641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.995683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.583 [2024-07-12 16:10:01.996054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.996071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.996086] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.997782] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.997829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.997873] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.997917] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.998241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.998284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.998326] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.998387] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.998879] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.998897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:01.998915] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.000740] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.000785] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.584 [2024-07-12 16:10:02.000827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.000869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.001260] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.001308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.001350] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.001392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.001742] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.001759] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.001775] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.003915] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.003960] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.005358] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.005403] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.005875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.005920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.005962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.006005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.006278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.006295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.006310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.009395] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.009441] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.009827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.009886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.584 [2024-07-12 16:10:02.010199] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.011731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.011777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.012566] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.012846] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.012863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.012878] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.015493] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.015543] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.015927] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.015971] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.016276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.017727] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.017773] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.018868] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.019143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.019160] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.019175] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.021526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.021577] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.022200] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.022245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.022549] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.584 [2024-07-12 16:10:02.023679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.584 [2024-07-12 16:10:02.023728] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.025149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.025429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.025446] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.025461] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.027747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.027796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.028729] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.028774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.029080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.029908] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.029953] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.031552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.031836] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.031852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.031867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.034046] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.034095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.035225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.035269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.035574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.036189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.036233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.037621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.846 [2024-07-12 16:10:02.037904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.037920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.037936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.040129] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.040207] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.041697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.041745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.042051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.042737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.042781] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.044164] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.846 [2024-07-12 16:10:02.044492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.044508] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.044533] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.046700] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.046751] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.048354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.048402] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.048720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.049586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.049631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.051017] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.051393] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.051413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.847 [2024-07-12 16:10:02.051429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.053731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.053777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.055051] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.055095] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.055399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.056874] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.056919] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.057955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.058231] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.058248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.058274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.064568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.064618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.065005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.065050] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.065417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.066701] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.066749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.068220] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.068502] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.068519] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.068535] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.070963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.847 [2024-07-12 16:10:02.071009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.072114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.072159] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.072568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.073827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.073872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.074355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.074727] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.074743] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.074758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.080271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.080320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.081396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.081442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.081867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.082813] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.082860] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.083240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.083517] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.083548] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.083579] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.087007] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.087055] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.088113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.847 [2024-07-12 16:10:02.088157] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.088542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.090022] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.090071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.091528] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.091888] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.091906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.091921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.098490] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.098539] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.100009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.100061] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.100367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.101224] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.101268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.102553] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.102835] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.102852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.102868] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.105926] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.105972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.106357] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.106403] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:41.847 [2024-07-12 16:10:02.106797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:41.847 [2024-07-12 16:10:02.108080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:41.847 [... the same accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: "Failed to get dst_mbufs!" line repeats several hundred times between 16:10:02.108 and 16:10:02.468; duplicate entries elided ...]
00:33:42.116 [2024-07-12 16:10:02.468371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:42.116 [2024-07-12 16:10:02.468414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.468456] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.468498] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.468774] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.468791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.468806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.473165] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.473213] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.473254] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.473296] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.473569] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.473618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.473660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.473702] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.473747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.474293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.474310] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.474325] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.476146] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.476192] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.476240] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.476282] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.476557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.476617] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.116 [2024-07-12 16:10:02.476664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.476715] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.476758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.477150] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.477169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.477189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.481553] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.481600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.481654] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.481697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.482166] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.482217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.482261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.482305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.482351] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.482639] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.482656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.482671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.484804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.484850] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.484901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.484943] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.485217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.485263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.116 [2024-07-12 16:10:02.485306] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.485348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.485391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.485799] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.485816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.116 [2024-07-12 16:10:02.485831] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.490212] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.490278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.490321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.490363] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.490687] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.490740] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.490790] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.490833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.490877] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.491150] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.491166] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.491181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.493141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.493186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.493228] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.493270] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.493796] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.493844] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.117 [2024-07-12 16:10:02.493891] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.493936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.493979] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.494401] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.494417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.494432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.499263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.499313] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.499358] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.499403] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.499795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.499852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.499896] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.499938] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.499986] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.500389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.500406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.500421] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.502211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.503612] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.503656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.504419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.504698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.504747] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.117 [2024-07-12 16:10:02.504790] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.504833] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.504880] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.505278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.505297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.505314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.510467] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.511868] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.511912] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.512733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.513009] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.513058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.513442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.513489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.514726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.515175] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.515191] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.515207] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.516978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.518504] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.518552] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.520166] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.520443] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.520497] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.117 [2024-07-12 16:10:02.521492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.521536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.522278] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.522690] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.522707] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.522728] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.527692] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.528596] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.528640] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.530058] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.530337] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.530383] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.117 [2024-07-12 16:10:02.531987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.532032] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.533501] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.533937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.533954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.533968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.536109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.537444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.537488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.538830] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.539106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.539152] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.118 [2024-07-12 16:10:02.540623] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.540666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.541557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.541904] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.541920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.541935] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.546951] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.547950] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.547997] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.548377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.548656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.548703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.550304] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.550348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.551954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.552233] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.552249] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.552274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.553932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.555429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.555473] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.555857] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.556234] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.118 [2024-07-12 16:10:02.556280] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.379 [2024-07-12 16:10:02.557586] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.557631] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.558019] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.558305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.558321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.558336] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.562684] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.564291] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.564339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.565305] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.565675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.565726] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.566110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.566169] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.567758] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.568203] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.568221] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.568238] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.572080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.573556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.573600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.575078] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.575441] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.575488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.380 [2024-07-12 16:10:02.576828] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.576872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.577277] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.577648] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.577664] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.577679] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.582229] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.583204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.583248] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.584535] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.584815] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.584863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.586330] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.586374] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.587106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.587392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.587414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.587430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.590252] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.591733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.591777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.592618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.592937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.592984] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.380 [2024-07-12 16:10:02.594459] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.594503] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.595970] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.596315] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.596333] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.596348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.601784] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.603064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.603108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.604578] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.604860] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.604907] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.605948] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.606005] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.607603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.607892] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.607909] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.607924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.612131] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.613419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.613463] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.614938] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.615217] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.615266] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.380 [2024-07-12 16:10:02.616191] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.616235] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.617650] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.617930] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.617947] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.617962] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.621816] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.623114] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.623158] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.624625] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.624906] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.624953] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.625812] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.625856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.627138] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.627414] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.627430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.627445] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.631234] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.632732] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.632777] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.634376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.634651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.380 [2024-07-12 16:10:02.634702] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.380 [2024-07-12 16:10:02.635769] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.635813] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.637098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.637375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.637392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.637411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.641284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.642890] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.642939] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.642982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.643258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.643304] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.644526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.644570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.646189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.646502] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.646518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.646533] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.652019] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.653536] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.653580] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.655177] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.655457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.656322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.381 [2024-07-12 16:10:02.656367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.657652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.657696] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.657975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.657992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.658008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.663311] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.664789] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.666259] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.667110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.667427] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.669029] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.670545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.672014] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.672489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.672937] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.672955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.672974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.678054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.679394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.680872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.682343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.682714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.381 [2024-07-12 16:10:02.683108] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.381 [2024-07-12 16:10:02.683488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:42.381 [... the identical *ERROR* line from accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources repeats continuously, several hundred occurrences between 16:10:02.683488 and 16:10:03.024375; duplicates elided ...]
00:33:42.648 [2024-07-12 16:10:03.024375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:42.648 [2024-07-12 16:10:03.024418] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.024460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.024503] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.024783] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.024804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.024819] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.028563] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.028610] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.028652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.028695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.028978] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.029029] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.029085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.029128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.029171] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.029458] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.029475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.029490] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.033332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.033380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.033422] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.033465] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.033741] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.033787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.648 [2024-07-12 16:10:03.033829] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.033872] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.033914] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.034368] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.034385] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.034401] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.038226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.038283] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.038328] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.038375] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.038655] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.038703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.038750] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.038798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.038841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.039127] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.039143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.039167] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.043080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.044633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.044678] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.648 [2024-07-12 16:10:03.046227] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.046720] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.046767] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.649 [2024-07-12 16:10:03.046810] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.046852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.046894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.047195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.047211] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.047226] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.049969] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.051416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.051460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.052749] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.053125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.053189] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.053571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.053618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.054003] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.054395] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.054413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.054435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.059010] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.059396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.059442] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.059825] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.060355] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.060404] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.649 [2024-07-12 16:10:03.060856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.060901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.062176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.062526] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.062542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.062557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.065901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.066649] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.066693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.068080] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.068406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.068453] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.068838] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.068886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.069266] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.069645] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.069663] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.069680] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.074088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.074473] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.074518] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.074901] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.075221] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.075271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.649 [2024-07-12 16:10:03.076670] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.076717] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.077298] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.077574] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.077591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.077607] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.081258] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.082370] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.082415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.083853] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.084130] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.084178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.084633] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.084677] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.085062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.085561] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.085581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.085602] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.649 [2024-07-12 16:10:03.089247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.089639] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.089697] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.090084] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.090495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.090543] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.911 [2024-07-12 16:10:03.091989] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.092032] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.093431] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.093787] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.093804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.093819] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.097592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.099164] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.099209] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.099938] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.100218] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.100264] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.101444] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.101488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.101886] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.102262] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.102285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.102301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.104995] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.105383] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.105429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.105814] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.106225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.106271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.911 [2024-07-12 16:10:03.107691] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.107739] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.109263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.109636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.109652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.109667] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.113764] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.911 [2024-07-12 16:10:03.115053] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.115098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.116220] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.116570] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.116618] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.117021] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.117068] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.117447] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.117841] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.117859] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.117876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.120785] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.121173] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.121218] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.121599] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.122091] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.122140] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.912 [2024-07-12 16:10:03.122522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.122567] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.122951] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.123399] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.123416] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.123433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.126571] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.126963] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.127012] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.127392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.127931] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.127979] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.128374] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.128420] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.128803] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.129208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.129227] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.129245] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.132093] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.132487] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.132534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.132921] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.133446] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.133494] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.912 [2024-07-12 16:10:03.133882] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.133929] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.134317] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.134762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.134780] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.134797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.137884] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.138272] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.138318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.138699] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.139116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.139174] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.139556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.139603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.139986] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.140395] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.140415] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.140432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.143318] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.143703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.143753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.144141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.144622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.144671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.912 [2024-07-12 16:10:03.145056] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.145104] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.145485] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.145973] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.145994] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.146010] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.149144] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.149530] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.149576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.149619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.150025] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.150096] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.150476] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.150520] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.150907] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.151372] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.151388] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.151408] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.155210] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.155925] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.155970] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.156985] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.157301] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.158204] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.912 [2024-07-12 16:10:03.158273] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.158653] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.158698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.159110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.159128] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.159143] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.163216] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.163604] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.163994] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.164376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.164651] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.912 [2024-07-12 16:10:03.166116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.166974] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.168239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.169826] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.170293] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.170322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.170338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.175642] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.176037] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.176417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.176804] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.177175] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.178704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.913 [2024-07-12 16:10:03.180295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.181059] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.182450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.182762] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.182793] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.182809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.186389] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.187993] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.189491] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.190988] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.191413] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.191815] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.192195] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.192576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.193955] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.194239] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.194256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.194271] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.198454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.199853] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.200806] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.202246] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.202522] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.203214] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.913 [2024-07-12 16:10:03.203596] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.203981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.204364] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.204644] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.204661] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.204677] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.209803] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.210704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.212098] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.213392] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.213817] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.215218] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.216542] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.216929] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.217324] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.217837] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.217857] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.217876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.222044] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.222436] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.222876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.224407] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.224683] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:42.913 [2024-07-12 16:10:03.225433] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:42.913 [2024-07-12 16:10:03.226823] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:42.913 ... (the same "Failed to get dst_mbufs!" error from accel_dpdk_cryptodev.c:476 is logged several hundred more times between 16:10:03.226823 and 16:10:03.502964) ...
00:33:43.181 [2024-07-12 16:10:03.502964] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs!
00:33:43.181 [2024-07-12 16:10:03.503008] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.503053] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.503096] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.503544] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.503560] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.503584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.505814] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.506197] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.506242] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.506622] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.507022] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.507074] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.507457] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.507506] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.507889] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.508354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.508370] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.508390] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.510842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.511230] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.511274] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.511656] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.512044] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.512113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.181 [2024-07-12 16:10:03.512495] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.512540] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.513752] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.514078] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.514094] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.514117] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.516109] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.181 [2024-07-12 16:10:03.516490] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.516534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.518069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.518348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.518394] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.519295] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.519339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.520543] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.520900] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.520917] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.520932] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.523186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.524480] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.524524] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.525912] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.526306] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.526352] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.182 [2024-07-12 16:10:03.527889] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.527933] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.529535] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.529895] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.529922] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.529949] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.531981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.532936] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.532981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.534184] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.534535] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.534592] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.534975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.535021] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.535401] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.535827] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.535845] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.535863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.537626] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.538492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.538545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.538929] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.539314] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.539361] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.182 [2024-07-12 16:10:03.539745] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.539791] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.541141] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.541418] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.541435] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.541450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.543509] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.543894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.543941] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.544320] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.544597] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.544643] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.546241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.546287] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.547883] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.548322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.548338] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.548353] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.550242] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.550627] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.550671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.551055] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.551334] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.551382] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.182 [2024-07-12 16:10:03.552981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.553027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.553987] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.554263] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.554279] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.554294] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.556403] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.557795] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.557840] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.559284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.559677] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.559730] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.561329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.561376] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.562972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.563455] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.563484] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.563499] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.565471] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.567073] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.567124] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.567972] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.568250] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.568297] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.182 [2024-07-12 16:10:03.569347] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.182 [2024-07-12 16:10:03.569405] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.569790] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.570178] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.570208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.570225] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.572085] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.573489] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.573534] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.574670] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.575125] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.575176] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.575556] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.575613] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.575999] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.576400] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.576417] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.576432] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.578086] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.578483] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.578529] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.578913] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.579322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.579371] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.183 [2024-07-12 16:10:03.580380] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.580424] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.581819] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.582191] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.582208] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.582223] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.584191] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.584576] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.584621] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.585509] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.585789] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.585842] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.586779] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.586824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.588419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.588698] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.588718] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.588733] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.591034] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.592636] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.592681] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.594283] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.594812] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.594866] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.183 [2024-07-12 16:10:03.596366] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.596411] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.597869] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.598332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.598349] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.598365] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.600308] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.601661] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.601721] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.601764] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.602137] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.602183] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.603584] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.603628] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.604011] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.604412] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.604430] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.604448] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.607731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.609162] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.609206] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.610598] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.610920] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.612395] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.183 [2024-07-12 16:10:03.612440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.613910] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.613954] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.614419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.614439] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.614454] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.617894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.619362] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.620835] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.183 [2024-07-12 16:10:03.621693] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.621982] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.623577] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.625054] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.626565] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.627030] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.627475] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.627494] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.627512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.630824] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.632429] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.633504] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.634798] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.635076] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.636557] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.447 [2024-07-12 16:10:03.637703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.447 [2024-07-12 16:10:03.638091] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.638472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.638956] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.638975] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.638992] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.641797] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.643082] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.644555] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.646004] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.646343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.646737] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.647132] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.647512] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.648568] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.648893] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.648909] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.648924] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.651984] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.653461] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.654285] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.654668] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.655038] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.655425] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.448 [2024-07-12 16:10:03.656589] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.657863] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.659329] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.659607] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.659624] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.659639] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.661968] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.662354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.662739] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.663118] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.663409] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.664926] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.666404] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.667867] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.668731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.669072] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.669088] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.669107] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.671367] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.671754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.673354] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.674789] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.675069] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.676539] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.448 [2024-07-12 16:10:03.677398] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.678778] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.680363] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.680642] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.680660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.680675] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.683654] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.684938] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.686410] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.687876] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.688182] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.689406] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.690713] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.692181] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.693646] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.694029] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.694047] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.694062] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.697659] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.699133] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.700600] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.702064] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.702450] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.703738] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.448 [2024-07-12 16:10:03.705220] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.706686] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.708123] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.708564] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.708581] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.708597] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.711981] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.713452] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.715013] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.448 [2024-07-12 16:10:03.716116] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.716419] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.717894] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.719365] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.720875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.721256] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.721660] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.721678] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.721695] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.725113] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.726110] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.727705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.729071] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.729348] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.730809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.449 [2024-07-12 16:10:03.731470] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.731856] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.732241] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.732629] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.732648] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.732666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.735821] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.737339] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.738809] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.740276] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.740705] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.741106] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.741488] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.741875] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.743247] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.743603] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.743619] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.743634] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.746703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.748186] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.748666] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.749053] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.749472] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.449 [2024-07-12 16:10:03.749861] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.718 [2024-07-12 16:10:03.953884] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.953928] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.954321] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.954671] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.954688] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.954703] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.956460] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.957852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.957897] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.958404] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.958838] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.958887] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.959268] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.959313] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.960006] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.960284] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.960304] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.960319] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.961919] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.962396] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.962440] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.962822] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.963343] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.963391] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.718 [2024-07-12 16:10:03.964378] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.964423] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.965716] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.965996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.966012] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.966027] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.967670] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.969269] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.969315] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.969694] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.970093] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.970149] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.970531] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.970575] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.971776] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.972105] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.972121] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.972136] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.973852] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.975332] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.975377] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.976839] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.977261] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.977322] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:33:43.718 [2024-07-12 16:10:03.977704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.977754] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.978138] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.978471] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.978487] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.978502] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.980133] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.981469] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.981514] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.982996] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.983275] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.983323] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.983704] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.983753] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.984147] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.984624] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.984642] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:33:43.718 [2024-07-12 16:10:03.984661] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
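The run of allocation failures summarized above can be quantified straight from the raw bdevperf console output. A minimal sketch, assuming that output was captured to a hypothetical bdevperf.log:

grep -c 'Failed to get dst_mbufs' bdevperf.log                      # total number of occurrences
grep 'Failed to get dst_mbufs' bdevperf.log | sed -n '1p;$p'        # first and last occurrence, i.e. the time window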
00:33:44.289 00:33:44.289 Latency(us) 00:33:44.289 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:44.289 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:44.289 Verification LBA range: start 0x0 length 0x100 00:33:44.289 crypto_ram : 5.76 44.48 2.78 0.00 0.00 2793733.51 325865.16 2129415.88 00:33:44.289 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:44.289 Verification LBA range: start 0x100 length 0x100 00:33:44.289 crypto_ram : 6.01 42.57 2.66 0.00 0.00 2932016.05 203262.42 2503676.85 00:33:44.289 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:44.289 Verification LBA range: start 0x0 length 0x100 00:33:44.289 crypto_ram1 : 5.76 44.47 2.78 0.00 0.00 2697055.31 325865.16 1935832.62 00:33:44.289 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:44.289 Verification LBA range: start 0x100 length 0x100 00:33:44.289 crypto_ram1 : 6.01 42.56 2.66 0.00 0.00 2831238.70 202455.83 2297188.04 00:33:44.289 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:44.289 Verification LBA range: start 0x0 length 0x100 00:33:44.289 crypto_ram2 : 5.56 306.55 19.16 0.00 0.00 375838.70 15325.34 558165.07 00:33:44.289 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:44.289 Verification LBA range: start 0x100 length 0x100 00:33:44.289 crypto_ram2 : 5.60 254.38 15.90 0.00 0.00 450398.18 2886.10 596881.72 00:33:44.289 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:33:44.289 Verification LBA range: start 0x0 length 0x100 00:33:44.289 crypto_ram3 : 5.64 318.11 19.88 0.00 0.00 352693.53 13409.67 467826.22 00:33:44.289 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:33:44.289 Verification LBA range: start 0x100 length 0x100 00:33:44.289 crypto_ram3 : 5.73 267.24 16.70 0.00 0.00 417926.59 55251.89 464599.83 00:33:44.289 =================================================================================================================== 00:33:44.289 Total : 1320.36 82.52 0.00 0.00 726969.23 2886.10 2503676.85 00:33:44.549 00:33:44.549 real 0m8.887s 00:33:44.549 user 0m17.127s 00:33:44.549 sys 0m0.292s 00:33:44.549 16:10:04 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:44.549 16:10:04 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:33:44.550 ************************************ 00:33:44.550 END TEST bdev_verify_big_io 00:33:44.550 ************************************ 00:33:44.550 16:10:04 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:44.550 16:10:04 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:44.550 16:10:04 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:44.550 16:10:04 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:44.550 16:10:04 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:44.550 ************************************ 00:33:44.550 START TEST bdev_write_zeroes 00:33:44.550 ************************************ 00:33:44.550 16:10:04 blockdev_crypto_qat.bdev_write_zeroes -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:44.550 [2024-07-12 16:10:04.996751] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:33:44.550 [2024-07-12 16:10:04.996795] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2739943 ] 00:33:44.810 [2024-07-12 16:10:05.085448] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:44.810 [2024-07-12 16:10:05.154433] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:44.810 [2024-07-12 16:10:05.175448] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:44.810 [2024-07-12 16:10:05.183474] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:44.810 [2024-07-12 16:10:05.191497] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:45.071 [2024-07-12 16:10:05.275378] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:47.612 [2024-07-12 16:10:07.431635] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:47.612 [2024-07-12 16:10:07.431686] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:47.612 [2024-07-12 16:10:07.431694] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:47.612 [2024-07-12 16:10:07.439653] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:47.612 [2024-07-12 16:10:07.439665] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:47.612 [2024-07-12 16:10:07.439671] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:47.612 [2024-07-12 16:10:07.447673] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:47.612 [2024-07-12 16:10:07.447684] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:47.612 [2024-07-12 16:10:07.447690] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:47.612 [2024-07-12 16:10:07.455693] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:47.612 [2024-07-12 16:10:07.455703] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:47.612 [2024-07-12 16:10:07.455713] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:47.612 Running I/O for 1 seconds... 
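In the bdevperf result tables above and below, the MiB/s column is simply IOPS scaled by the job's I/O size. A quick sanity check, as a minimal sketch using the 65536-byte verify figures from the table above (the write_zeroes jobs below use 4096-byte I/O):

awk 'BEGIN { iops=306.55; io=65536; printf "%.2f MiB/s\n", iops*io/1048576 }'   # prints 19.16, matching crypto_ram2 above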
00:33:48.179 00:33:48.179 Latency(us) 00:33:48.179 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:48.179 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:48.179 crypto_ram : 1.02 2383.01 9.31 0.00 0.00 53412.35 4864.79 64527.75 00:33:48.179 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:48.179 crypto_ram1 : 1.02 2396.20 9.36 0.00 0.00 52896.52 4814.38 59688.17 00:33:48.179 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:48.179 crypto_ram2 : 1.02 18455.22 72.09 0.00 0.00 6853.27 2129.92 9074.22 00:33:48.179 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:48.179 crypto_ram3 : 1.02 18487.59 72.22 0.00 0.00 6822.19 2117.32 7108.14 00:33:48.179 =================================================================================================================== 00:33:48.179 Total : 41722.02 162.98 0.00 0.00 12162.37 2117.32 64527.75 00:33:48.438 00:33:48.438 real 0m3.860s 00:33:48.438 user 0m3.579s 00:33:48.438 sys 0m0.243s 00:33:48.438 16:10:08 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:48.438 16:10:08 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:48.438 ************************************ 00:33:48.438 END TEST bdev_write_zeroes 00:33:48.438 ************************************ 00:33:48.438 16:10:08 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:33:48.438 16:10:08 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:48.438 16:10:08 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:48.438 16:10:08 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:48.438 16:10:08 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:48.438 ************************************ 00:33:48.438 START TEST bdev_json_nonenclosed 00:33:48.438 ************************************ 00:33:48.438 16:10:08 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:48.697 [2024-07-12 16:10:08.931601] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:33:48.697 [2024-07-12 16:10:08.931650] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2740573 ] 00:33:48.697 [2024-07-12 16:10:09.021858] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:48.697 [2024-07-12 16:10:09.096579] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:48.697 [2024-07-12 16:10:09.096633] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:33:48.697 [2024-07-12 16:10:09.096644] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:48.697 [2024-07-12 16:10:09.096653] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:48.956 00:33:48.956 real 0m0.280s 00:33:48.956 user 0m0.168s 00:33:48.956 sys 0m0.109s 00:33:48.956 16:10:09 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:33:48.956 16:10:09 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:48.956 16:10:09 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:48.956 ************************************ 00:33:48.956 END TEST bdev_json_nonenclosed 00:33:48.956 ************************************ 00:33:48.956 16:10:09 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:33:48.956 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true 00:33:48.956 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:48.956 16:10:09 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:33:48.956 16:10:09 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:48.956 16:10:09 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:48.956 ************************************ 00:33:48.956 START TEST bdev_json_nonarray 00:33:48.956 ************************************ 00:33:48.956 16:10:09 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:48.956 [2024-07-12 16:10:09.290046] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:33:48.956 [2024-07-12 16:10:09.290097] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2740615 ] 00:33:48.956 [2024-07-12 16:10:09.378892] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:49.216 [2024-07-12 16:10:09.453865] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:49.216 [2024-07-12 16:10:09.453924] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
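The two negative tests here feed bdevperf deliberately malformed configs: nonenclosed.json is not wrapped in an outer object and nonarray.json does not make "subsystems" an array, so json_config_prepare_ctx rejects them with the errors shown. For reference, the shape it does accept (the same shape gen_nvme.sh produces for spdk_dd later in this log) is a single object whose "subsystems" member is an array:

{
  "subsystems": [
    { "subsystem": "bdev", "config": [ ... ] }
  ]
}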
00:33:49.216 [2024-07-12 16:10:09.453936] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:49.216 [2024-07-12 16:10:09.453943] app.c:1057:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:49.216 00:33:49.216 real 0m0.277s 00:33:49.216 user 0m0.180s 00:33:49.216 sys 0m0.096s 00:33:49.216 16:10:09 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:33:49.216 16:10:09 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:49.216 16:10:09 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:49.216 ************************************ 00:33:49.216 END TEST bdev_json_nonarray 00:33:49.216 ************************************ 00:33:49.216 16:10:09 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234 00:33:49.216 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true 00:33:49.216 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:33:49.216 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:33:49.216 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:33:49.216 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:33:49.216 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:33:49.216 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:49.217 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:49.217 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:33:49.217 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:33:49.217 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:33:49.217 16:10:09 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:33:49.217 00:33:49.217 real 1m10.907s 00:33:49.217 user 2m54.141s 00:33:49.217 sys 0m6.726s 00:33:49.217 16:10:09 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:49.217 16:10:09 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:49.217 ************************************ 00:33:49.217 END TEST blockdev_crypto_qat 00:33:49.217 ************************************ 00:33:49.217 16:10:09 -- common/autotest_common.sh@1142 -- # return 0 00:33:49.217 16:10:09 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:33:49.217 16:10:09 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:33:49.217 16:10:09 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:49.217 16:10:09 -- common/autotest_common.sh@10 -- # set +x 00:33:49.217 ************************************ 00:33:49.217 START TEST chaining 00:33:49.217 ************************************ 00:33:49.217 16:10:09 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:33:49.476 * Looking for test storage... 
00:33:49.476 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:49.476 16:10:09 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:33:49.476 16:10:09 chaining -- nvmf/common.sh@7 -- # uname -s 00:33:49.476 16:10:09 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:33:49.476 16:10:09 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:33:49.477 16:10:09 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:33:49.477 16:10:09 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:33:49.477 16:10:09 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:33:49.477 16:10:09 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:49.477 16:10:09 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:49.477 16:10:09 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:49.477 16:10:09 chaining -- paths/export.sh@5 -- # 
export PATH 00:33:49.477 16:10:09 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@47 -- # : 0 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:33:49.477 16:10:09 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:33:49.477 16:10:09 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:33:49.477 16:10:09 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:33:49.477 16:10:09 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:33:49.477 16:10:09 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:33:49.477 16:10:09 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:33:49.477 16:10:09 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:33:49.477 16:10:09 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:33:49.477 16:10:09 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:33:49.477 16:10:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@296 -- # e810=() 00:33:57.610 
16:10:17 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@297 -- # x722=() 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@298 -- # mlx=() 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:33:57.610 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:33:57.610 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:33:57.610 
16:10:17 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:33:57.610 Found net devices under 0000:4b:00.0: cvl_0_0 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:33:57.610 Found net devices under 0000:4b:00.1: cvl_0_1 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:33:57.610 16:10:17 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:33:57.871 16:10:18 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:33:57.871 16:10:18 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:33:57.871 16:10:18 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:33:57.871 16:10:18 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:33:57.871 16:10:18 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip 
link set lo up 00:33:57.871 16:10:18 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:33:57.871 16:10:18 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:33:57.871 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:33:57.871 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.551 ms 00:33:57.871 00:33:57.871 --- 10.0.0.2 ping statistics --- 00:33:57.871 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:57.871 rtt min/avg/max/mdev = 0.551/0.551/0.551/0.000 ms 00:33:57.871 16:10:18 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:33:57.871 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:33:57.872 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.309 ms 00:33:57.872 00:33:57.872 --- 10.0.0.1 ping statistics --- 00:33:57.872 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:57.872 rtt min/avg/max/mdev = 0.309/0.309/0.309/0.000 ms 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@422 -- # return 0 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:33:57.872 16:10:18 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:33:57.872 16:10:18 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:33:57.872 16:10:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@481 -- # nvmfpid=2744850 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@482 -- # waitforlisten 2744850 00:33:57.872 16:10:18 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:33:57.872 16:10:18 chaining -- common/autotest_common.sh@829 -- # '[' -z 2744850 ']' 00:33:57.872 16:10:18 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:57.872 16:10:18 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:57.872 16:10:18 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:57.872 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:57.872 16:10:18 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:57.872 16:10:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:58.132 [2024-07-12 16:10:18.354848] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
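The malloc0/crypto0/crypto1 bdevs and the TCP listener reported just below are created over rpc_cmd against this nvmf_tgt. A minimal sketch of what that setup presumably looks like; it is illustrative only: RPC option spellings vary between SPDK releases, the cipher and key material are taken from the key0/key1 arrays declared by chaining.sh, and stacking crypto1 on top of crypto0 is an assumption based on the test name:

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py   # rpc_cmd is assumed to wrap this with the target's RPC socket
$rpc bdev_malloc_create -b malloc0 64 512                            # backing bdev; size and block size assumed
$rpc accel_crypto_key_create -c AES_CBC -k 00112233445566778899001122334455 -n key0   # produces 'Found key "key0"'
$rpc accel_crypto_key_create -c AES_CBC -k 22334455667788990011223344550011 -n key1   # produces 'Found key "key1"'
$rpc bdev_crypto_create -n key0 malloc0 crypto0                      # crypto0 over malloc0
$rpc bdev_crypto_create -n key1 crypto0 crypto1                      # crypto1 chained over crypto0 (assumed)
$rpc nvmf_create_transport -t TCP
$rpc nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDKISFASTANDAWESOME
$rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 crypto1
$rpc nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t tcp -a 10.0.0.2 -s 4420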
00:33:58.132 [2024-07-12 16:10:18.354909] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:58.132 [2024-07-12 16:10:18.442460] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:58.132 [2024-07-12 16:10:18.543358] app.c: 607:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:33:58.132 [2024-07-12 16:10:18.543411] app.c: 608:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:33:58.132 [2024-07-12 16:10:18.543425] app.c: 613:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:33:58.132 [2024-07-12 16:10:18.543435] app.c: 614:app_setup_trace: *NOTICE*: SPDK application currently running. 00:33:58.132 [2024-07-12 16:10:18.543445] app.c: 615:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:33:58.132 [2024-07-12 16:10:18.543478] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@862 -- # return 0 00:33:59.519 16:10:19 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:59.519 16:10:19 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@69 -- # mktemp 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.xg6rq5iozE 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@69 -- # mktemp 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.3MViob2srs 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:59.519 malloc0 00:33:59.519 true 00:33:59.519 true 00:33:59.519 [2024-07-12 16:10:19.647690] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:33:59.519 crypto0 00:33:59.519 [2024-07-12 16:10:19.655725] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:33:59.519 crypto1 00:33:59.519 [2024-07-12 16:10:19.663870] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:33:59.519 [2024-07-12 16:10:19.680064] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@85 -- # update_stats 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:59.519 16:10:19 
chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:59.519 16:10:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.xg6rq5iozE bs=1K 
count=64 00:33:59.519 64+0 records in 00:33:59.519 64+0 records out 00:33:59.519 65536 bytes (66 kB, 64 KiB) copied, 0.00100012 s, 65.5 MB/s 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.xg6rq5iozE --ob Nvme0n1 --bs 65536 --count 1 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@25 -- # local config 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:33:59.519 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@31 -- # config='{ 00:33:59.519 "subsystems": [ 00:33:59.519 { 00:33:59.519 "subsystem": "bdev", 00:33:59.519 "config": [ 00:33:59.519 { 00:33:59.519 "method": "bdev_nvme_attach_controller", 00:33:59.519 "params": { 00:33:59.519 "trtype": "tcp", 00:33:59.519 "adrfam": "IPv4", 00:33:59.519 "name": "Nvme0", 00:33:59.519 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:59.519 "traddr": "10.0.0.2", 00:33:59.519 "trsvcid": "4420" 00:33:59.519 } 00:33:59.519 }, 00:33:59.519 { 00:33:59.519 "method": "bdev_set_options", 00:33:59.519 "params": { 00:33:59.519 "bdev_auto_examine": false 00:33:59.519 } 00:33:59.519 } 00:33:59.519 ] 00:33:59.519 } 00:33:59.519 ] 00:33:59.519 }' 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.xg6rq5iozE --ob Nvme0n1 --bs 65536 --count 1 00:33:59.519 16:10:19 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:33:59.519 "subsystems": [ 00:33:59.519 { 00:33:59.519 "subsystem": "bdev", 00:33:59.519 "config": [ 00:33:59.519 { 00:33:59.519 "method": "bdev_nvme_attach_controller", 00:33:59.519 "params": { 00:33:59.519 "trtype": "tcp", 00:33:59.519 "adrfam": "IPv4", 00:33:59.519 "name": "Nvme0", 00:33:59.519 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:59.519 "traddr": "10.0.0.2", 00:33:59.519 "trsvcid": "4420" 00:33:59.519 } 00:33:59.519 }, 00:33:59.519 { 00:33:59.519 "method": "bdev_set_options", 00:33:59.519 "params": { 00:33:59.519 "bdev_auto_examine": false 00:33:59.519 } 00:33:59.519 } 00:33:59.519 ] 00:33:59.519 } 00:33:59.519 ] 00:33:59.519 }' 00:33:59.779 [2024-07-12 16:10:20.042632] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
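The config handed to spdk_dd above is the gen_nvme.sh remote JSON with one element appended by jq: the path .subsystems[0].config[.subsystems[0].config | length] indexes one slot past the current end of the config array, so |= drops the new bdev_set_options entry there. The same filter in isolation, as a minimal sketch against a hypothetical empty config:

echo '{"subsystems":[{"subsystem":"bdev","config":[]}]}' | \
  jq '.subsystems[0].config[.subsystems[0].config | length] |= {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}'
# emits the same "bdev_auto_examine": false entry that appears in the config block above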
00:33:59.779 [2024-07-12 16:10:20.042771] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2745206 ] 00:33:59.779 [2024-07-12 16:10:20.187396] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:00.040 [2024-07-12 16:10:20.281561] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:00.301  Copying: 64/64 [kB] (average 20 MBps) 00:34:00.301 00:34:00.301 16:10:20 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:34:00.301 16:10:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.301 16:10:20 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:00.301 16:10:20 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:00.301 16:10:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:00.301 16:10:20 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:00.301 16:10:20 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:00.301 16:10:20 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:00.301 16:10:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:00.301 16:10:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:00.301 16:10:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:00.562 16:10:20 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:34:00.562 16:10:20 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:34:00.562 16:10:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.562 16:10:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:00.562 16:10:20 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:00.562 16:10:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:00.562 16:10:20 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:00.562 16:10:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:00.562 16:10:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 
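Every get_stat call in this stretch of chaining.sh follows the same pattern: rpc_cmd accel_get_stats piped into jq, reading either .sequence_executed or the .executed counter of a single opcode. A minimal standalone sketch of that pattern (rpc_cmd is assumed to resolve to rpc.py against the target's RPC socket):

rpc=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
$rpc accel_get_stats | jq -r .sequence_executed
$rpc accel_get_stats | jq -r '.operations[] | select(.opcode == "encrypt").executed'
$rpc accel_get_stats | jq -r '.operations[] | select(.opcode == "copy").executed'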
00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@96 -- # update_stats 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:00.563 16:10:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:00.563 16:10:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 
00:34:00.825 16:10:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:00.825 16:10:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:00.825 16:10:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:00.825 16:10:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:00.825 16:10:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:00.825 16:10:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.3MViob2srs --ib Nvme0n1 --bs 65536 --count 1 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@25 -- # local config 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:00.825 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:00.825 "subsystems": [ 00:34:00.825 { 00:34:00.825 "subsystem": "bdev", 00:34:00.825 "config": [ 00:34:00.825 { 00:34:00.825 "method": "bdev_nvme_attach_controller", 00:34:00.825 "params": { 00:34:00.825 "trtype": "tcp", 00:34:00.825 "adrfam": "IPv4", 00:34:00.825 "name": "Nvme0", 00:34:00.825 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:00.825 "traddr": "10.0.0.2", 00:34:00.825 "trsvcid": "4420" 00:34:00.825 } 00:34:00.825 }, 00:34:00.825 { 00:34:00.825 "method": "bdev_set_options", 00:34:00.825 "params": { 00:34:00.825 "bdev_auto_examine": false 00:34:00.825 } 00:34:00.825 } 00:34:00.825 ] 00:34:00.825 } 00:34:00.825 ] 00:34:00.825 }' 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.3MViob2srs --ib Nvme0n1 --bs 65536 --count 1 00:34:00.825 16:10:21 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:00.825 "subsystems": [ 00:34:00.825 { 00:34:00.825 "subsystem": "bdev", 00:34:00.825 "config": [ 00:34:00.825 { 00:34:00.825 "method": "bdev_nvme_attach_controller", 00:34:00.825 "params": { 00:34:00.825 "trtype": "tcp", 00:34:00.825 "adrfam": "IPv4", 00:34:00.825 "name": "Nvme0", 00:34:00.825 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:00.825 "traddr": "10.0.0.2", 00:34:00.825 "trsvcid": "4420" 00:34:00.825 } 00:34:00.825 }, 00:34:00.825 { 00:34:00.825 "method": "bdev_set_options", 00:34:00.825 "params": { 
00:34:00.825 "bdev_auto_examine": false 00:34:00.825 } 00:34:00.825 } 00:34:00.825 ] 00:34:00.825 } 00:34:00.825 ] 00:34:00.825 }' 00:34:00.825 [2024-07-12 16:10:21.263431] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:34:00.825 [2024-07-12 16:10:21.263498] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2745355 ] 00:34:01.087 [2024-07-12 16:10:21.357580] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:01.087 [2024-07-12 16:10:21.452313] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:01.609  Copying: 64/64 [kB] (average 20 MBps) 00:34:01.609 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 
00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:01.609 16:10:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:01.609 16:10:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:01.609 16:10:22 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:34:01.609 16:10:22 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.xg6rq5iozE /tmp/tmp.3MViob2srs 00:34:01.609 16:10:22 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:01.609 16:10:22 chaining -- bdev/chaining.sh@25 -- # local config 00:34:01.609 16:10:22 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:01.609 16:10:22 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:01.609 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:01.869 16:10:22 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:01.869 "subsystems": [ 00:34:01.869 { 00:34:01.870 "subsystem": "bdev", 00:34:01.870 "config": [ 00:34:01.870 { 00:34:01.870 "method": "bdev_nvme_attach_controller", 00:34:01.870 "params": { 00:34:01.870 "trtype": "tcp", 00:34:01.870 "adrfam": "IPv4", 00:34:01.870 "name": "Nvme0", 00:34:01.870 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:01.870 "traddr": "10.0.0.2", 00:34:01.870 "trsvcid": "4420" 00:34:01.870 } 00:34:01.870 }, 00:34:01.870 { 00:34:01.870 "method": "bdev_set_options", 00:34:01.870 "params": { 00:34:01.870 "bdev_auto_examine": false 00:34:01.870 } 00:34:01.870 } 00:34:01.870 ] 00:34:01.870 } 00:34:01.870 ] 00:34:01.870 }' 00:34:01.870 16:10:22 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:34:01.870 16:10:22 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:01.870 "subsystems": [ 00:34:01.870 { 00:34:01.870 "subsystem": "bdev", 00:34:01.870 "config": [ 00:34:01.870 { 00:34:01.870 "method": "bdev_nvme_attach_controller", 00:34:01.870 "params": { 00:34:01.870 "trtype": "tcp", 00:34:01.870 "adrfam": "IPv4", 00:34:01.870 "name": "Nvme0", 00:34:01.870 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:01.870 "traddr": "10.0.0.2", 00:34:01.870 "trsvcid": "4420" 00:34:01.870 } 00:34:01.870 }, 00:34:01.870 { 00:34:01.870 "method": "bdev_set_options", 00:34:01.870 "params": { 00:34:01.870 "bdev_auto_examine": false 00:34:01.870 } 00:34:01.870 } 00:34:01.870 ] 00:34:01.870 
} 00:34:01.870 ] 00:34:01.870 }' 00:34:01.870 [2024-07-12 16:10:22.137299] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:34:01.870 [2024-07-12 16:10:22.137367] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2745573 ] 00:34:01.870 [2024-07-12 16:10:22.210092] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:01.870 [2024-07-12 16:10:22.303113] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:02.442  Copying: 64/64 [kB] (average 20 MBps) 00:34:02.442 00:34:02.442 16:10:22 chaining -- bdev/chaining.sh@106 -- # update_stats 00:34:02.442 16:10:22 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:02.442 16:10:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:02.442 16:10:22 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:02.442 16:10:22 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:02.442 16:10:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:02.442 16:10:22 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:02.442 16:10:22 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:02.442 16:10:22 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:02.442 16:10:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:02.442 16:10:22 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:02.442 16:10:22 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:02.703 16:10:22 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:02.703 16:10:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:02.703 16:10:22 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:02.703 16:10:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:02.703 16:10:22 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:02.703 16:10:22 chaining -- 
common/autotest_common.sh@10 -- # set +x 00:34:02.703 16:10:23 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:02.703 16:10:23 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:02.703 16:10:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:02.703 16:10:23 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.xg6rq5iozE --ob Nvme0n1 --bs 4096 --count 16 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@25 -- # local config 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:02.703 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:02.703 "subsystems": [ 00:34:02.703 { 00:34:02.703 "subsystem": "bdev", 00:34:02.703 "config": [ 00:34:02.703 { 00:34:02.703 "method": "bdev_nvme_attach_controller", 00:34:02.703 "params": { 00:34:02.703 "trtype": "tcp", 00:34:02.703 "adrfam": "IPv4", 00:34:02.703 "name": "Nvme0", 00:34:02.703 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:02.703 "traddr": "10.0.0.2", 00:34:02.703 "trsvcid": "4420" 00:34:02.703 } 00:34:02.703 }, 00:34:02.703 { 00:34:02.703 "method": "bdev_set_options", 00:34:02.703 "params": { 00:34:02.703 "bdev_auto_examine": false 00:34:02.703 } 00:34:02.703 } 00:34:02.703 ] 00:34:02.703 } 00:34:02.703 ] 00:34:02.703 }' 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.xg6rq5iozE --ob Nvme0n1 --bs 4096 --count 16 00:34:02.703 16:10:23 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:02.703 "subsystems": [ 00:34:02.703 { 00:34:02.703 "subsystem": "bdev", 00:34:02.703 "config": [ 00:34:02.703 { 00:34:02.703 "method": "bdev_nvme_attach_controller", 00:34:02.703 "params": { 00:34:02.703 "trtype": "tcp", 00:34:02.703 "adrfam": "IPv4", 00:34:02.703 "name": "Nvme0", 00:34:02.703 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:02.703 "traddr": "10.0.0.2", 00:34:02.703 "trsvcid": "4420" 00:34:02.703 } 00:34:02.703 }, 00:34:02.703 { 00:34:02.703 "method": "bdev_set_options", 00:34:02.703 "params": { 00:34:02.703 "bdev_auto_examine": false 00:34:02.703 } 00:34:02.703 } 00:34:02.703 ] 00:34:02.703 } 00:34:02.703 ] 00:34:02.703 }' 00:34:02.964 [2024-07-12 16:10:23.166451] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 
initialization... 00:34:02.964 [2024-07-12 16:10:23.166518] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2745700 ] 00:34:02.964 [2024-07-12 16:10:23.256508] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:02.964 [2024-07-12 16:10:23.350018] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:03.483  Copying: 64/64 [kB] (average 12 MBps) 00:34:03.483 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:03.483 16:10:23 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.483 16:10:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:03.483 16:10:23 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:03.483 16:10:23 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.483 16:10:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:03.483 16:10:23 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.483 16:10:23 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:34:03.743 16:10:23 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:03.744 16:10:23 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:03.744 16:10:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:03.744 16:10:23 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@112 -- # (( 14 == 
stats[decrypt_executed] )) 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:03.744 16:10:23 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:03.744 16:10:23 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.744 16:10:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:03.744 16:10:23 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@114 -- # update_stats 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:03.744 16:10:24 chaining -- 
bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:03.744 16:10:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:03.744 16:10:24 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:04.004 16:10:24 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:34:04.004 16:10:24 chaining -- bdev/chaining.sh@117 -- # : 00:34:04.004 16:10:24 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.3MViob2srs --ib Nvme0n1 --bs 4096 --count 16 00:34:04.004 16:10:24 chaining -- bdev/chaining.sh@25 -- # local config 00:34:04.004 16:10:24 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:34:04.004 16:10:24 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:34:04.004 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:34:04.004 16:10:24 chaining -- bdev/chaining.sh@31 -- # config='{ 00:34:04.004 "subsystems": [ 00:34:04.004 { 00:34:04.004 "subsystem": "bdev", 00:34:04.004 "config": [ 00:34:04.004 { 00:34:04.004 "method": "bdev_nvme_attach_controller", 00:34:04.004 "params": { 00:34:04.004 "trtype": "tcp", 00:34:04.004 "adrfam": "IPv4", 00:34:04.004 "name": "Nvme0", 00:34:04.004 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:04.004 "traddr": "10.0.0.2", 00:34:04.004 "trsvcid": "4420" 00:34:04.004 } 00:34:04.004 }, 00:34:04.004 { 00:34:04.004 "method": "bdev_set_options", 00:34:04.004 "params": { 00:34:04.004 "bdev_auto_examine": false 00:34:04.004 } 00:34:04.004 } 00:34:04.004 ] 00:34:04.004 } 00:34:04.004 ] 00:34:04.004 }' 00:34:04.004 16:10:24 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.3MViob2srs --ib Nvme0n1 --bs 4096 --count 16 00:34:04.004 16:10:24 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:34:04.004 "subsystems": [ 00:34:04.004 { 00:34:04.004 "subsystem": "bdev", 00:34:04.004 "config": [ 00:34:04.004 { 00:34:04.004 "method": "bdev_nvme_attach_controller", 00:34:04.004 "params": { 00:34:04.004 "trtype": "tcp", 00:34:04.004 "adrfam": "IPv4", 00:34:04.004 "name": "Nvme0", 00:34:04.004 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:34:04.004 "traddr": "10.0.0.2", 00:34:04.004 "trsvcid": "4420" 
00:34:04.004 } 00:34:04.004 }, 00:34:04.004 { 00:34:04.004 "method": "bdev_set_options", 00:34:04.004 "params": { 00:34:04.004 "bdev_auto_examine": false 00:34:04.004 } 00:34:04.004 } 00:34:04.004 ] 00:34:04.004 } 00:34:04.004 ] 00:34:04.004 }' 00:34:04.004 [2024-07-12 16:10:24.311623] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:34:04.004 [2024-07-12 16:10:24.311689] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2745941 ] 00:34:04.004 [2024-07-12 16:10:24.403616] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:04.263 [2024-07-12 16:10:24.496073] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:04.792  Copying: 64/64 [kB] (average 680 kBps) 00:34:04.792 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:04.792 16:10:25 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:04.792 16:10:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:04.792 16:10:25 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:04.792 16:10:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:04.792 16:10:25 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:04.792 16:10:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:04.792 16:10:25 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | 
select(.opcode == "decrypt").executed' 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.xg6rq5iozE /tmp/tmp.3MViob2srs 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.xg6rq5iozE /tmp/tmp.3MViob2srs 00:34:05.092 16:10:25 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@117 -- # sync 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@120 -- # set +e 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:34:05.092 rmmod nvme_tcp 00:34:05.092 rmmod nvme_fabrics 00:34:05.092 rmmod nvme_keyring 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@124 -- # set -e 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@125 -- # return 0 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@489 -- # '[' -n 2744850 ']' 00:34:05.092 16:10:25 chaining -- nvmf/common.sh@490 -- # killprocess 2744850 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@948 -- # '[' -z 2744850 ']' 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@952 -- # kill -0 2744850 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@953 -- # uname 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2744850 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2744850' 00:34:05.092 killing process with pid 2744850 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@967 -- 
# kill 2744850 00:34:05.092 16:10:25 chaining -- common/autotest_common.sh@972 -- # wait 2744850 00:34:05.353 16:10:25 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:34:05.353 16:10:25 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:34:05.353 16:10:25 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:34:05.353 16:10:25 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:34:05.353 16:10:25 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:34:05.353 16:10:25 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:05.353 16:10:25 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:05.353 16:10:25 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:07.898 16:10:27 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:34:07.898 16:10:27 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:34:07.898 16:10:27 chaining -- bdev/chaining.sh@132 -- # bperfpid=2746608 00:34:07.898 16:10:27 chaining -- bdev/chaining.sh@134 -- # waitforlisten 2746608 00:34:07.898 16:10:27 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:07.898 16:10:27 chaining -- common/autotest_common.sh@829 -- # '[' -z 2746608 ']' 00:34:07.898 16:10:27 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:07.898 16:10:27 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:07.898 16:10:27 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:07.898 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:07.898 16:10:27 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:07.898 16:10:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:07.898 [2024-07-12 16:10:27.801134] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 
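The bdevperf run that starts here follows the usual SPDK pattern: launch the app idle with -z --wait-for-rpc, build the malloc0 -> crypto0 -> crypto1 chain over RPC once the socket is up, then drive the 5-second verify workload with bdevperf.py perform_tests. A condensed sketch of that flow as it appears in the trace (the RPC payload that creates the keys and crypto bdevs is not printed in the log, so it is only indicated as a comment):

  build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z &
  bperfpid=$!
  # waitforlisten "$bperfpid"   # harness helper: poll until /var/tmp/spdk.sock answers
  # ... rpc_cmd batch that creates malloc0 and the crypto0/crypto1 chain (elided in the trace) ...
  examples/bdev/bdevperf/bdevperf.py perform_tests
  # killprocess "$bperfpid"     # harness helper: signal the app and wait for it to exit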
00:34:07.898 [2024-07-12 16:10:27.801238] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2746608 ] 00:34:07.898 [2024-07-12 16:10:27.894538] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:07.898 [2024-07-12 16:10:27.987903] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:08.840 16:10:29 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:08.840 16:10:29 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:08.840 16:10:29 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:34:08.840 16:10:29 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:08.840 16:10:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:08.840 malloc0 00:34:08.840 true 00:34:08.840 true 00:34:08.840 [2024-07-12 16:10:29.145562] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:08.840 crypto0 00:34:08.840 [2024-07-12 16:10:29.153588] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:08.840 crypto1 00:34:08.840 16:10:29 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:08.840 16:10:29 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:09.100 Running I/O for 5 seconds... 00:34:14.385 00:34:14.385 Latency(us) 00:34:14.385 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:14.385 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:14.385 Verification LBA range: start 0x0 length 0x2000 00:34:14.385 crypto1 : 5.01 14278.08 55.77 0.00 0.00 17879.88 2079.51 12804.73 00:34:14.385 =================================================================================================================== 00:34:14.385 Total : 14278.08 55.77 0.00 0.00 17879.88 2079.51 12804.73 00:34:14.385 0 00:34:14.385 16:10:34 chaining -- bdev/chaining.sh@146 -- # killprocess 2746608 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@948 -- # '[' -z 2746608 ']' 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@952 -- # kill -0 2746608 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@953 -- # uname 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2746608 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2746608' 00:34:14.385 killing process with pid 2746608 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@967 -- # kill 2746608 00:34:14.385 Received shutdown signal, test time was about 5.000000 seconds 00:34:14.385 00:34:14.385 Latency(us) 00:34:14.385 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:14.385 =================================================================================================================== 00:34:14.385 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@972 -- # wait 2746608 00:34:14.385 16:10:34 chaining -- bdev/chaining.sh@151 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:14.385 16:10:34 chaining -- bdev/chaining.sh@152 -- # bperfpid=2747572 00:34:14.385 16:10:34 chaining -- bdev/chaining.sh@154 -- # waitforlisten 2747572 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@829 -- # '[' -z 2747572 ']' 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:14.385 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:14.385 16:10:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.385 [2024-07-12 16:10:34.653061] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:34:14.385 [2024-07-12 16:10:34.653110] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2747572 ] 00:34:14.385 [2024-07-12 16:10:34.718868] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:14.385 [2024-07-12 16:10:34.780934] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:14.645 16:10:34 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:14.645 16:10:34 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:14.645 16:10:34 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:34:14.645 16:10:34 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:14.645 16:10:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:14.645 malloc0 00:34:14.645 true 00:34:14.645 true 00:34:14.645 [2024-07-12 16:10:34.957575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:34:14.645 [2024-07-12 16:10:34.957610] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:34:14.645 [2024-07-12 16:10:34.957622] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x212cca0 00:34:14.645 [2024-07-12 16:10:34.957628] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:34:14.645 [2024-07-12 16:10:34.958500] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:34:14.645 [2024-07-12 16:10:34.958518] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:34:14.645 pt0 00:34:14.645 [2024-07-12 16:10:34.965603] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:14.645 crypto0 00:34:14.645 [2024-07-12 16:10:34.973621] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:34:14.645 crypto1 00:34:14.645 16:10:34 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:14.646 16:10:34 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:14.905 Running I/O for 5 seconds... 
00:34:20.187 00:34:20.187 Latency(us) 00:34:20.187 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:20.187 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:20.187 Verification LBA range: start 0x0 length 0x2000 00:34:20.187 crypto1 : 5.02 11268.83 44.02 0.00 0.00 22657.34 5268.09 13712.15 00:34:20.187 =================================================================================================================== 00:34:20.187 Total : 11268.83 44.02 0.00 0.00 22657.34 5268.09 13712.15 00:34:20.187 0 00:34:20.187 16:10:40 chaining -- bdev/chaining.sh@167 -- # killprocess 2747572 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@948 -- # '[' -z 2747572 ']' 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@952 -- # kill -0 2747572 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@953 -- # uname 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2747572 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2747572' 00:34:20.187 killing process with pid 2747572 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@967 -- # kill 2747572 00:34:20.187 Received shutdown signal, test time was about 5.000000 seconds 00:34:20.187 00:34:20.187 Latency(us) 00:34:20.187 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:20.187 =================================================================================================================== 00:34:20.187 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@972 -- # wait 2747572 00:34:20.187 16:10:40 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:34:20.187 16:10:40 chaining -- bdev/chaining.sh@170 -- # killprocess 2747572 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@948 -- # '[' -z 2747572 ']' 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@952 -- # kill -0 2747572 00:34:20.187 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (2747572) - No such process 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 2747572 is not found' 00:34:20.187 Process with pid 2747572 is not found 00:34:20.187 16:10:40 chaining -- bdev/chaining.sh@171 -- # wait 2747572 00:34:20.187 16:10:40 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:34:20.187 16:10:40 
chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:34:20.187 16:10:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@296 -- # e810=() 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@297 -- # x722=() 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@298 -- # mlx=() 00:34:20.187 16:10:40 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:34:20.188 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 
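The nvmftestinit path that continues below wires the TCP target and initiator through a network namespace so both ends can run on one host: the first discovered cvl port is moved into cvl_0_0_ns_spdk for the target side (10.0.0.2:4420), the second stays in the root namespace for the initiator (10.0.0.1), and a ping in each direction confirms the link before nvmf_tgt is started inside the namespace. A condensed sketch of the commands traced in the following lines (the interface names cvl_0_0/cvl_0_1 are the ones discovered above):

  ip netns add cvl_0_0_ns_spdk
  ip link set cvl_0_0 netns cvl_0_0_ns_spdk                  # target-side port
  ip addr add 10.0.0.1/24 dev cvl_0_1                        # initiator side, root namespace
  ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0
  ip link set cvl_0_1 up
  ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up
  ip netns exec cvl_0_0_ns_spdk ip link set lo up
  iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT
  ping -c 1 10.0.0.2                                         # root ns -> target ns
  ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1           # target ns -> root ns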
00:34:20.188 16:10:40 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:34:20.188 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:34:20.188 Found net devices under 0000:4b:00.0: cvl_0_0 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:34:20.188 Found net devices under 0000:4b:00.1: cvl_0_1 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@242 -- # 
NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:34:20.188 16:10:40 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:34:20.449 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:34:20.449 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.633 ms 00:34:20.449 00:34:20.449 --- 10.0.0.2 ping statistics --- 00:34:20.449 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:20.449 rtt min/avg/max/mdev = 0.633/0.633/0.633/0.000 ms 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:34:20.449 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:34:20.449 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.306 ms 00:34:20.449 00:34:20.449 --- 10.0.0.1 ping statistics --- 00:34:20.449 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:20.449 rtt min/avg/max/mdev = 0.306/0.306/0.306/0.000 ms 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@422 -- # return 0 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:34:20.449 16:10:40 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:34:20.449 16:10:40 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:34:20.449 16:10:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@481 -- # nvmfpid=2748760 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@482 -- # waitforlisten 2748760 00:34:20.449 16:10:40 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:34:20.449 16:10:40 chaining -- common/autotest_common.sh@829 -- # '[' -z 2748760 ']' 00:34:20.449 16:10:40 chaining -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:20.449 16:10:40 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:20.449 16:10:40 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:20.449 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:20.449 16:10:40 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:20.449 16:10:40 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:20.709 [2024-07-12 16:10:40.909485] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:34:20.709 [2024-07-12 16:10:40.909550] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:20.709 [2024-07-12 16:10:40.996872] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:20.709 [2024-07-12 16:10:41.095454] app.c: 607:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:34:20.709 [2024-07-12 16:10:41.095515] app.c: 608:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:34:20.709 [2024-07-12 16:10:41.095528] app.c: 613:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:34:20.709 [2024-07-12 16:10:41.095540] app.c: 614:app_setup_trace: *NOTICE*: SPDK application currently running. 00:34:20.709 [2024-07-12 16:10:41.095549] app.c: 615:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:34:20.709 [2024-07-12 16:10:41.095581] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:21.649 16:10:41 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:21.649 16:10:41 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:34:21.649 16:10:41 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:21.649 malloc0 00:34:21.649 [2024-07-12 16:10:41.806206] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:34:21.649 [2024-07-12 16:10:41.822363] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:21.649 16:10:41 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:34:21.649 16:10:41 chaining -- bdev/chaining.sh@189 -- # bperfpid=2748801 00:34:21.649 16:10:41 chaining -- bdev/chaining.sh@191 -- # waitforlisten 2748801 /var/tmp/bperf.sock 00:34:21.649 16:10:41 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@829 
-- # '[' -z 2748801 ']' 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:34:21.649 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:21.649 16:10:41 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:21.649 [2024-07-12 16:10:41.903964] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:34:21.649 [2024-07-12 16:10:41.904023] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2748801 ] 00:34:21.649 [2024-07-12 16:10:41.982524] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:21.649 [2024-07-12 16:10:42.058138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:22.218 16:10:42 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:22.218 16:10:42 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:22.218 16:10:42 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:34:22.218 16:10:42 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:34:22.788 [2024-07-12 16:10:43.055035] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:22.788 nvme0n1 00:34:22.788 true 00:34:22.788 crypto0 00:34:22.788 16:10:43 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:34:22.788 Running I/O for 5 seconds... 
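The nvmf_tcp_init trace above is what gives the chaining test a real NVMe/TCP data path on a single host: one E810 port (cvl_0_0) is moved into the cvl_0_0_ns_spdk namespace and addressed as 10.0.0.2, the other (cvl_0_1) stays in the default namespace as 10.0.0.1, TCP port 4420 is opened, reachability is ping-checked in both directions, and nvmf_tgt is then started inside the namespace; bdevperf runs as the initiator in the default namespace and is driven through its /var/tmp/bperf.sock RPC socket (the perform_tests call above). A condensed sketch of that setup, using the commands and addresses this particular run happened to use:

  ip netns add cvl_0_0_ns_spdk                      # target-side namespace
  ip link set cvl_0_0 netns cvl_0_0_ns_spdk         # move one port into it
  ip addr add 10.0.0.1/24 dev cvl_0_1               # initiator side, default netns
  ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0
  ip link set cvl_0_1 up
  ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up
  ip netns exec cvl_0_0_ns_spdk ip link set lo up
  iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT   # NVMe/TCP listen port
  ping -c 1 10.0.0.2                                # reachability, both ways
  ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1
  ip netns exec cvl_0_0_ns_spdk ./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2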
00:34:28.072 00:34:28.072 Latency(us) 00:34:28.072 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:28.072 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:28.072 Verification LBA range: start 0x0 length 0x2000 00:34:28.072 crypto0 : 5.02 8872.19 34.66 0.00 0.00 28770.03 4310.25 24399.56 00:34:28.072 =================================================================================================================== 00:34:28.072 Total : 8872.19 34.66 0.00 0.00 28770.03 4310.25 24399.56 00:34:28.072 0 00:34:28.072 16:10:48 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:34:28.072 16:10:48 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:34:28.072 16:10:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.072 16:10:48 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@205 -- # sequence=89082 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:28.073 16:10:48 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@206 -- # encrypt=44541 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:28.333 16:10:48 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@207 -- # decrypt=44541 
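The get_stat helpers above pull accel-framework counters back over the same bperf.sock RPC socket, and the test then checks that the chained pipeline really ran once per I/O: encrypt + decrypt must equal both sequence_executed and the crc32c count (89082 = 44541 + 44541 in this pass). A hedged sketch of that bookkeeping, using the rpc.py/jq calls seen in the trace:

  rpc="./scripts/rpc.py -s /var/tmp/bperf.sock"
  sequence=$($rpc accel_get_stats | jq -r .sequence_executed)
  encrypt=$($rpc accel_get_stats  | jq -r '.operations[] | select(.opcode == "encrypt").executed')
  decrypt=$($rpc accel_get_stats  | jq -r '.operations[] | select(.opcode == "decrypt").executed')
  crc32c=$($rpc accel_get_stats   | jq -r '.operations[] | select(.opcode == "crc32c").executed')
  (( sequence > 0 ))                       # something actually executed
  (( encrypt + decrypt == sequence ))      # one crypto op per sequence
  (( encrypt + decrypt == crc32c ))        # and one chained crc32c per sequence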
00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:28.593 16:10:48 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:34:28.853 16:10:49 chaining -- bdev/chaining.sh@208 -- # crc32c=89082 00:34:28.853 16:10:49 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:34:28.853 16:10:49 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:34:28.853 16:10:49 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:34:28.853 16:10:49 chaining -- bdev/chaining.sh@214 -- # killprocess 2748801 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@948 -- # '[' -z 2748801 ']' 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@952 -- # kill -0 2748801 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@953 -- # uname 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2748801 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2748801' 00:34:28.853 killing process with pid 2748801 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@967 -- # kill 2748801 00:34:28.853 Received shutdown signal, test time was about 5.000000 seconds 00:34:28.853 00:34:28.853 Latency(us) 00:34:28.853 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:28.853 =================================================================================================================== 00:34:28.853 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@972 -- # wait 2748801 00:34:28.853 16:10:49 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:34:28.853 16:10:49 chaining -- bdev/chaining.sh@219 -- # bperfpid=2750047 00:34:28.853 16:10:49 chaining -- bdev/chaining.sh@221 -- # waitforlisten 2750047 /var/tmp/bperf.sock 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@829 -- # '[' -z 2750047 ']' 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 
00:34:28.853 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:28.853 16:10:49 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:28.853 [2024-07-12 16:10:49.298620] Starting SPDK v24.09-pre git sha1 be7837808 / DPDK 24.03.0 initialization... 00:34:28.853 [2024-07-12 16:10:49.298685] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2750047 ] 00:34:29.113 [2024-07-12 16:10:49.374919] app.c: 913:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:29.113 [2024-07-12 16:10:49.437588] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:29.373 16:10:49 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:29.373 16:10:49 chaining -- common/autotest_common.sh@862 -- # return 0 00:34:29.373 16:10:49 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:34:29.373 16:10:49 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:34:30.313 [2024-07-12 16:10:50.485898] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:30.313 nvme0n1 00:34:30.313 true 00:34:30.313 crypto0 00:34:30.313 16:10:50 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:34:30.313 Running I/O for 5 seconds... 00:34:35.591 00:34:35.591 Latency(us) 00:34:35.591 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:35.591 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:34:35.591 Verification LBA range: start 0x0 length 0x200 00:34:35.591 crypto0 : 5.00 2265.14 141.57 0.00 0.00 13834.78 519.88 16535.24 00:34:35.591 =================================================================================================================== 00:34:35.591 Total : 2265.14 141.57 0.00 0.00 13834.78 519.88 16535.24 00:34:35.591 0 00:34:35.591 16:10:55 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:34:35.591 16:10:55 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@233 -- # sequence=22668 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:35.592 
16:10:55 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:35.592 16:10:55 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:35.851 16:10:56 chaining -- bdev/chaining.sh@234 -- # encrypt=11334 00:34:35.851 16:10:56 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:34:35.851 16:10:56 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:34:35.851 16:10:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:35.851 16:10:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:35.851 16:10:56 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:35.851 16:10:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:35.851 16:10:56 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:35.851 16:10:56 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:35.852 16:10:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:35.852 16:10:56 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:36.111 16:10:56 chaining -- bdev/chaining.sh@235 -- # decrypt=11334 00:34:36.111 16:10:56 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:34:36.111 16:10:56 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:34:36.111 16:10:56 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:36.111 16:10:56 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:36.111 16:10:56 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:34:36.111 16:10:56 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:36.112 16:10:56 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:34:36.112 16:10:56 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:36.112 16:10:56 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:34:36.112 16:10:56 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:36.372 16:10:56 chaining -- bdev/chaining.sh@236 -- # crc32c=22668 00:34:36.372 16:10:56 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:34:36.372 16:10:56 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:34:36.372 16:10:56 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:34:36.372 16:10:56 chaining -- bdev/chaining.sh@242 -- # killprocess 2750047 00:34:36.372 16:10:56 chaining -- common/autotest_common.sh@948 -- # '[' -z 2750047 ']' 00:34:36.372 16:10:56 chaining -- common/autotest_common.sh@952 -- # kill -0 2750047 00:34:36.372 16:10:56 chaining -- common/autotest_common.sh@953 -- # uname 00:34:36.372 16:10:56 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:36.372 16:10:56 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2750047 00:34:36.372 16:10:56 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:36.372 
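The killprocess trace unfolding around this point (and earlier for pid 2748801) is the stock autotest teardown: confirm the pid is still alive, confirm it is an SPDK reactor rather than the sudo wrapper, then kill it and reap the 5-second bdevperf run. A sketch of that pattern, assuming the pid from this run:

  pid=2750047
  kill -0 "$pid"                                 # still running?
  name=$(ps --no-headers -o comm= "$pid")        # -> reactor_0 in this log
  [ "$name" != sudo ] && kill "$pid"
  wait "$pid" 2>/dev/null || true                # works because bperf is a child of this shell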
16:10:56 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:36.372 16:10:56 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2750047' 00:34:36.372 killing process with pid 2750047 00:34:36.372 16:10:56 chaining -- common/autotest_common.sh@967 -- # kill 2750047 00:34:36.372 Received shutdown signal, test time was about 5.000000 seconds 00:34:36.372 00:34:36.372 Latency(us) 00:34:36.372 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:36.372 =================================================================================================================== 00:34:36.372 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:36.372 16:10:56 chaining -- common/autotest_common.sh@972 -- # wait 2750047 00:34:36.372 16:10:56 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:34:36.372 16:10:56 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:34:36.372 16:10:56 chaining -- nvmf/common.sh@117 -- # sync 00:34:36.372 16:10:56 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:34:36.372 16:10:56 chaining -- nvmf/common.sh@120 -- # set +e 00:34:36.372 16:10:56 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:34:36.372 16:10:56 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:34:36.372 rmmod nvme_tcp 00:34:36.372 rmmod nvme_fabrics 00:34:36.658 rmmod nvme_keyring 00:34:36.658 16:10:56 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:34:36.658 16:10:56 chaining -- nvmf/common.sh@124 -- # set -e 00:34:36.658 16:10:56 chaining -- nvmf/common.sh@125 -- # return 0 00:34:36.658 16:10:56 chaining -- nvmf/common.sh@489 -- # '[' -n 2748760 ']' 00:34:36.658 16:10:56 chaining -- nvmf/common.sh@490 -- # killprocess 2748760 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@948 -- # '[' -z 2748760 ']' 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@952 -- # kill -0 2748760 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@953 -- # uname 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 2748760 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 2748760' 00:34:36.658 killing process with pid 2748760 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@967 -- # kill 2748760 00:34:36.658 16:10:56 chaining -- common/autotest_common.sh@972 -- # wait 2748760 00:34:36.922 16:10:57 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:34:36.922 16:10:57 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:34:36.922 16:10:57 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:34:36.923 16:10:57 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:34:36.923 16:10:57 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:34:36.923 16:10:57 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:36.923 16:10:57 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:36.923 16:10:57 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:38.834 16:10:59 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:34:38.834 16:10:59 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM 
EXIT 00:34:38.834 00:34:38.834 real 0m49.550s 00:34:38.834 user 1m1.908s 00:34:38.834 sys 0m11.497s 00:34:38.834 16:10:59 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:38.834 16:10:59 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:38.834 ************************************ 00:34:38.834 END TEST chaining 00:34:38.834 ************************************ 00:34:38.834 16:10:59 -- common/autotest_common.sh@1142 -- # return 0 00:34:38.834 16:10:59 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:34:38.834 16:10:59 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:34:38.834 16:10:59 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:34:38.834 16:10:59 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:34:38.834 16:10:59 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:34:38.834 16:10:59 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:34:38.834 16:10:59 -- common/autotest_common.sh@722 -- # xtrace_disable 00:34:38.834 16:10:59 -- common/autotest_common.sh@10 -- # set +x 00:34:38.834 16:10:59 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:34:38.834 16:10:59 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:34:38.834 16:10:59 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:34:38.834 16:10:59 -- common/autotest_common.sh@10 -- # set +x 00:34:45.423 INFO: APP EXITING 00:34:45.423 INFO: killing all VMs 00:34:45.423 INFO: killing vhost app 00:34:45.423 INFO: EXIT DONE 00:34:49.621 Waiting for block devices as requested 00:34:49.621 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:34:49.621 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:34:49.621 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:34:49.881 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:34:49.881 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:34:49.881 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:34:49.881 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:34:50.142 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:34:50.142 0000:65:00.0 (8086 0a54): vfio-pci -> nvme 00:34:50.402 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:34:50.402 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:34:50.402 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:34:50.662 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:34:50.662 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:34:50.662 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:34:50.922 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma 00:34:50.922 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:34:55.124 Cleaning 00:34:55.124 Removing: /var/run/dpdk/spdk0/config 00:34:55.124 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:34:55.125 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:34:55.125 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:34:55.125 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:34:55.125 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:34:55.125 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:34:55.125 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:34:55.125 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:34:55.125 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:34:55.125 Removing: /var/run/dpdk/spdk0/hugepage_info 00:34:55.125 Removing: /dev/shm/nvmf_trace.0 00:34:55.125 Removing: /dev/shm/spdk_tgt_trace.pid2452080 00:34:55.125 Removing: /var/run/dpdk/spdk0 00:34:55.125 Removing: /var/run/dpdk/spdk_pid2447508 00:34:55.125 Removing: /var/run/dpdk/spdk_pid2449845 00:34:55.125 Removing: /var/run/dpdk/spdk_pid2452080 00:34:55.384 Removing: 
/var/run/dpdk/spdk_pid2452590 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2453530 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2453821 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2454791 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2455003 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2455215 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2458452 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2460382 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2460738 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2461089 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2461419 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2461649 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2461872 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2462237 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2462658 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2463622 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2467064 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2467282 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2467482 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2467802 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2467848 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2468185 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2468298 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2468536 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2468860 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2469177 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2469367 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2469553 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2469853 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2470174 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2470462 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2470569 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2470852 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2471167 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2471456 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2471569 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2471848 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2472168 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2472488 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2472690 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2472863 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2473163 00:34:55.384 Removing: /var/run/dpdk/spdk_pid2473481 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2473808 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2474135 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2474460 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2474783 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2475113 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2475438 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2475764 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2475831 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2476209 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2476638 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2476969 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2477183 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2481294 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2483592 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2485622 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2486577 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2488100 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2488425 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2488457 00:34:55.385 Removing: /var/run/dpdk/spdk_pid2488480 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2493188 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2493808 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2495037 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2495422 00:34:55.646 Removing: 
/var/run/dpdk/spdk_pid2501597 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2503546 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2504475 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2508902 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2510659 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2511668 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2516507 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2518984 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2520606 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2531729 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2534403 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2535552 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2547419 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2549879 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2551063 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2561663 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2565465 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2567077 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2578951 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2582307 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2583348 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2595495 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2598239 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2599411 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2612018 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2616200 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2617498 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2618536 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2622158 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2628502 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2632412 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2638216 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2642447 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2648996 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2652002 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2660285 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2662789 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2670095 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2672623 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2679157 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2681822 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2686931 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2687250 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2687653 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2688173 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2688594 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2689548 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2690304 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2690706 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2692808 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2694757 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2696906 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2698899 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2701343 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2703396 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2705541 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2707281 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2707968 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2708402 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2710807 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2713171 00:34:55.646 Removing: /var/run/dpdk/spdk_pid2715633 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2716904 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2718330 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2719046 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2719085 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2719150 00:34:55.907 Removing: 
/var/run/dpdk/spdk_pid2719479 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2719515 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2720790 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2722623 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2724585 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2725522 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2726495 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2726782 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2726884 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2727068 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2728086 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2728834 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2729453 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2732589 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2734887 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2737174 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2738428 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2739943 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2740573 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2740615 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2745206 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2745355 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2745573 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2745700 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2745941 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2746608 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2747572 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2748801 00:34:55.907 Removing: /var/run/dpdk/spdk_pid2750047 00:34:55.907 Clean 00:34:55.907 16:11:16 -- common/autotest_common.sh@1451 -- # return 0 00:34:55.907 16:11:16 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:34:55.907 16:11:16 -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:55.907 16:11:16 -- common/autotest_common.sh@10 -- # set +x 00:34:56.168 16:11:16 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:34:56.168 16:11:16 -- common/autotest_common.sh@728 -- # xtrace_disable 00:34:56.168 16:11:16 -- common/autotest_common.sh@10 -- # set +x 00:34:56.168 16:11:16 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:34:56.168 16:11:16 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:34:56.168 16:11:16 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:34:56.168 16:11:16 -- spdk/autotest.sh@391 -- # hash lcov 00:34:56.168 16:11:16 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:34:56.168 16:11:16 -- spdk/autotest.sh@393 -- # hostname 00:34:56.168 16:11:16 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-CYP-06 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:34:56.168 geninfo: WARNING: invalid characters removed from testname! 
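With the tests done, autotest switches to coverage: lcov captures the gcov data produced by this run (branch and function coverage enabled), merges it with the pre-test baseline, and strips bundled DPDK, system headers and example apps from the totals before the report is archived. A trimmed sketch of that sequence following the flags visible in the trace (the full invocation also sets several genhtml/geninfo rc options omitted here):

  LCOV_OPTS="--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --no-external -q"
  lcov $LCOV_OPTS -c -d ./spdk -t "$(hostname)" -o cov_test.info        # capture this run
  lcov $LCOV_OPTS -a cov_base.info -a cov_test.info -o cov_total.info   # merge with baseline
  lcov $LCOV_OPTS -r cov_total.info '*/dpdk/*'  -o cov_total.info       # drop bundled DPDK
  lcov $LCOV_OPTS -r cov_total.info '/usr/*'    -o cov_total.info       # drop system code
  lcov $LCOV_OPTS -r cov_total.info '*/examples/vmd/*' -o cov_total.info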
00:35:18.122 16:11:38 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:20.655 16:11:41 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:23.201 16:11:43 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:25.177 16:11:45 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:27.089 16:11:47 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:28.999 16:11:49 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:31.533 16:11:51 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:35:31.533 16:11:51 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:35:31.533 16:11:51 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:35:31.533 16:11:51 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:35:31.533 16:11:51 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:35:31.533 16:11:51 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:31.533 16:11:51 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:31.533 16:11:51 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:31.533 16:11:51 -- paths/export.sh@5 -- $ export PATH 00:35:31.533 16:11:51 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:31.533 16:11:51 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:31.533 16:11:51 -- common/autobuild_common.sh@444 -- $ date +%s 00:35:31.533 16:11:51 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1720793511.XXXXXX 00:35:31.533 16:11:51 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1720793511.oBrvEM 00:35:31.533 16:11:51 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:35:31.533 16:11:51 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:35:31.533 16:11:51 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:35:31.533 16:11:51 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:35:31.533 16:11:51 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:35:31.533 16:11:51 -- common/autobuild_common.sh@460 -- $ get_config_params 00:35:31.533 16:11:51 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:35:31.533 16:11:51 -- common/autotest_common.sh@10 -- $ set +x 00:35:31.533 16:11:51 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:35:31.533 16:11:51 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:35:31.533 16:11:51 -- pm/common@17 -- $ local monitor 00:35:31.533 16:11:51 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:31.533 16:11:51 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:31.533 16:11:51 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:31.533 16:11:51 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:31.533 16:11:51 -- pm/common@21 -- $ date +%s 00:35:31.533 16:11:51 -- 
pm/common@25 -- $ sleep 1 00:35:31.533 16:11:51 -- pm/common@21 -- $ date +%s 00:35:31.533 16:11:51 -- pm/common@21 -- $ date +%s 00:35:31.533 16:11:51 -- pm/common@21 -- $ date +%s 00:35:31.533 16:11:51 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720793511 00:35:31.533 16:11:51 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720793511 00:35:31.533 16:11:51 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720793511 00:35:31.533 16:11:51 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1720793511 00:35:31.533 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720793511_collect-vmstat.pm.log 00:35:31.533 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720793511_collect-cpu-load.pm.log 00:35:31.533 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720793511_collect-cpu-temp.pm.log 00:35:31.533 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1720793511_collect-bmc-pm.bmc.pm.log 00:35:32.105 16:11:52 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:35:32.105 16:11:52 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j128 00:35:32.105 16:11:52 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:32.105 16:11:52 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:35:32.105 16:11:52 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:35:32.105 16:11:52 -- spdk/autopackage.sh@19 -- $ timing_finish 00:35:32.105 16:11:52 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:35:32.105 16:11:52 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:35:32.105 16:11:52 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:35:32.365 16:11:52 -- spdk/autopackage.sh@20 -- $ exit 0 00:35:32.365 16:11:52 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:35:32.365 16:11:52 -- pm/common@29 -- $ signal_monitor_resources TERM 00:35:32.365 16:11:52 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:35:32.365 16:11:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:32.365 16:11:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:35:32.365 16:11:52 -- pm/common@44 -- $ pid=2763028 00:35:32.365 16:11:52 -- pm/common@50 -- $ kill -TERM 2763028 00:35:32.365 16:11:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:32.365 16:11:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:35:32.365 16:11:52 -- pm/common@44 -- $ pid=2763029 00:35:32.366 16:11:52 -- pm/common@50 
-- $ kill -TERM 2763029 00:35:32.366 16:11:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:32.366 16:11:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:35:32.366 16:11:52 -- pm/common@44 -- $ pid=2763031 00:35:32.366 16:11:52 -- pm/common@50 -- $ kill -TERM 2763031 00:35:32.366 16:11:52 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:32.366 16:11:52 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:35:32.366 16:11:52 -- pm/common@44 -- $ pid=2763054 00:35:32.366 16:11:52 -- pm/common@50 -- $ sudo -E kill -TERM 2763054 00:35:32.366 + [[ -n 2322349 ]] 00:35:32.366 + sudo kill 2322349 00:35:32.378 [Pipeline] } 00:35:32.404 [Pipeline] // stage 00:35:32.413 [Pipeline] } 00:35:32.434 [Pipeline] // timeout 00:35:32.440 [Pipeline] } 00:35:32.458 [Pipeline] // catchError 00:35:32.464 [Pipeline] } 00:35:32.541 [Pipeline] // wrap 00:35:32.545 [Pipeline] } 00:35:32.555 [Pipeline] // catchError 00:35:32.563 [Pipeline] stage 00:35:32.564 [Pipeline] { (Epilogue) 00:35:32.573 [Pipeline] catchError 00:35:32.575 [Pipeline] { 00:35:32.584 [Pipeline] echo 00:35:32.585 Cleanup processes 00:35:32.590 [Pipeline] sh 00:35:32.874 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:32.874 2763134 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:35:32.874 2763528 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:32.888 [Pipeline] sh 00:35:33.174 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:33.174 ++ grep -v 'sudo pgrep' 00:35:33.174 ++ awk '{print $1}' 00:35:33.174 + sudo kill -9 2763134 00:35:33.186 [Pipeline] sh 00:35:33.470 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:35:45.713 [Pipeline] sh 00:35:46.008 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:35:46.008 Artifacts sizes are good 00:35:46.027 [Pipeline] archiveArtifacts 00:35:46.036 Archiving artifacts 00:35:46.204 [Pipeline] sh 00:35:46.517 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:35:46.532 [Pipeline] cleanWs 00:35:46.542 [WS-CLEANUP] Deleting project workspace... 00:35:46.542 [WS-CLEANUP] Deferred wipeout is used... 00:35:46.549 [WS-CLEANUP] done 00:35:46.551 [Pipeline] } 00:35:46.571 [Pipeline] // catchError 00:35:46.583 [Pipeline] sh 00:35:46.865 + logger -p user.info -t JENKINS-CI 00:35:46.875 [Pipeline] } 00:35:46.892 [Pipeline] // stage 00:35:46.898 [Pipeline] } 00:35:46.916 [Pipeline] // node 00:35:46.922 [Pipeline] End of Pipeline 00:35:46.954 Finished: SUCCESS
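One detail worth noting from the epilogue above: before artifacts are compressed and archived, the pipeline reaps anything still running out of the workspace (here the leftover ipmitool sdr dump, pid 2763134) with a pgrep/awk/kill -9 one-liner. A sketch of that reaper, as run here:

  pids=$(sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk \
           | grep -v 'sudo pgrep' | awk '{print $1}')
  [ -n "$pids" ] && sudo kill -9 $pids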